Entropy vs predictability in PRNGs
If entropy is the measure of surprise, and a given PRNG has a uniform output distribution, then its entropy should be high. So the Mersenne Twister (MT) would have high entropy.
But MT is also predictable: from enough output you can reconstruct its past bits and predict its future bits.
What is the relationship between entropy and predictability?
Tags: random-number-generator, entropy
asked 18 hours ago by Bastien (edited 13 hours ago)
If it's predictable, how can it be surprising? – Ella Rose♦, 15 hours ago
@fgrieu You're right, thanks. Made it more general. – Bastien, 13 hours ago
@EllaRose I'm trying to draw a clear line between entropy and predictability as they apply to random number generators. – Bastien, 13 hours ago
5 Answers
"entropy is the measure of surprise"
That's informal, short and non-quantitative, but correct as far as it goes. In the case of a random number generator we must refine it: entropy is the measure of surprise in the outputs of the RNG for a skilled observer who knows the RNG design, including any parameters (for MT: the Mersenne prime used, and a few others), but not the seed. Unless otherwise stated, the seed (if any) is assumed uniformly random, and the observer is allowed arbitrarily large computing power.
The entropy considered here is a property of the generator, not of one particular bitstring that it outputs.
Entropy can further be defined for the total output of a generator, or per output bit. In cryptography we measure entropy in bits, so that an ideal uniform True RNG yields 1 bit of entropy per output bit.
For any Pseudo RNG, the whole output is predictable from the design, parameters and seed; hence the entropy of the whole output is limited to the entropy of whatever generates its seed, which is finite. Accordingly, the entropy per output bit decreases to zero as the output size grows towards infinity.
"the Mersenne Twister (MT) has high entropy."
No, because it is a PRNG (see above).
"the MT is also predictable."
Yes, given enough output, and with little computing power.
"What's the relationship between entropy and predictability?"
If a bitstring generator's full output is predictable from a finite-length prefix (as is the case for MT), then the generator has finite total entropy (bounded by that prefix length in bits, for Shannon entropy), and vanishingly small entropy per output bit.
The converse is false for any practical definition of (un)predictability. In particular, there exist practical Cryptographically Secure PRNGs (thus of finite total entropy) that are practically unpredictable.
To make things quantitative, consider a generator $G$ of arbitrarily long bitstrings, and write $G_b$ for that generator restricted to its first $b$ bits. $G_b$ can generate $2^b$ different $b$-bit bitstrings $B$ with respective probabilities $p_B$ (possibly $0$ for some $B$), with $1=\sum p_B$ (the sum running over the $2^b$ bitstrings $B$). $G_b$ has Shannon entropy, in bits,
$$H_b(G)=\sum_{B\text{ with }p_B\ne0}p_B\,\log_2\!\left(\frac1{p_B}\right)$$
and its average entropy per bit is $H_b(G)/b$.
For an ideal TRNG (whose output is uniformly random, independent bits) and any $b$, all $b$-bit bitstrings are equiprobable with $p_B=2^{-b}$, and it follows that $H_b(G)=b$.
For any PRNG with an $s$-bit seed (under any seed distribution), $H_b(G)\le\min(b,s)$. The entropy of the generator's whole output is $H(G)=\displaystyle\lim_{b\to\infty}H_b(G)$. That is maximal when the seed is uniformly random, with $H(G)\le s$, and $H(G)=s$ when all seeds generate different outputs.
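To make the bound concrete, here is a minimal sketch that computes $H_b(G)$ by exhaustive enumeration for a hypothetical toy generator with an $s=8$-bit seed (a small LCG emitting its top state bit; the parameters are illustrative, not MT's). It shows $H_b(G)\le\min(b,8)$ and $H_b(G)/b$ tending to zero:

```python
from collections import Counter
from math import log2

def toy_prng(seed, b):
    # Hypothetical 8-bit-seed toy generator: a small LCG whose top
    # state bit is emitted at each step. Illustrative only, not MT.
    bits, state = [], seed
    for _ in range(b):
        state = (5 * state + 3) % 256
        bits.append(state >> 7)
    return tuple(bits)

def shannon_entropy(b):
    # H_b(G) for a uniformly random 8-bit seed, by exact enumeration
    # of all 256 seeds and their b-bit output prefixes.
    counts = Counter(toy_prng(seed, b) for seed in range(256))
    return sum(-(c / 256) * log2(c / 256) for c in counts.values())

for b in (1, 2, 4, 8, 32, 128):
    h = shannon_entropy(b)
    print(f"b={b:4d}  H_b(G)={h:5.2f} bits  H_b(G)/b={h / b:.3f}")
# H_b(G) never exceeds s = 8 bits, so H_b(G)/b vanishes as b grows.
```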
– fgrieu (answered 13 hours ago, edited 13 mins ago)
If, as you say, H(any bit string) = 0, and you wanted to transmit that string to the Martians without loss in one attempt, what determines the minimum transmission length? – Paul Uszak, 9 hours ago
Ok, I'll repeat this back to you to see if I understand it. PRNGs have relatively low entropy because, while they may start with some high-entropy bits from some source as the seed, the way the PRNG then generates subsequent bits is linear and predictable, such that all the effective entropy is in the seed only. A CSPRNG, in contrast, generates bits in a non-linear fashion such that, even though it's still deterministic, the effective (practical) entropy remains high because it's infeasible to predict the next bit. Is that right? – Bastien, 8 hours ago
The entropy of a random string is the number of bits you need to describe the full string; there is no computational aspect to it.
So if MT is predictable, meaning that from "few" bits you can recover the full stream, then it cannot have high entropy.
In fact an efficient PRNG cannot have very high entropy (because the seed/secret key must not be too long), but that does not mean there is an efficient attack to recover the full string.
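To see the description-length point concretely, a tiny sketch: an arbitrarily long MT output stream is fully reproduced from a description of a few bytes, namely the seed and the length (the values below are arbitrary examples):

```python
import random

seed, nbits = 0xDEADBEEF, 8_000_000   # arbitrary example values

# A million-byte output stream is fully described by (seed, nbits):
stream = random.Random(seed).getrandbits(nbits)
rebuilt = random.Random(seed).getrandbits(nbits)
assert stream == rebuilt  # a few-byte description reproduces the whole stream
```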
– Ievgeni, new contributor (answered 17 hours ago)
"The entropy of a random string is the number of bits you need to describe the full string": this appears to be the definition of Kolmogorov complexity, rather than entropy. – Ella Rose♦, 15 hours ago
For the entropy of a bit string to be meaningful, the string must have been produced by some particular random or partially-random process which had a certain probability of producing that exact string, and a certain probability of producing something else. The entropy of the string is then, roughly(*), the negative log of the probability that the process that produced it would have done so. Thus, if some process generates a string which has 8 bits of entropy, that means the process had a 1 in 256 chance of generating that particular string.
(*) There are a variety of ways of measuring entropy, but for most purposes they're close enough to each other that a simple approximation can be reasonably close to all of them.
If a process does not generate all strings with equal probability, it's often appropriate to regard the entropy produced by the process as that of the highest-probability string. So if a process has a 50% chance of generating the 16-bit all-zero string, and a one in 131,070 chance of generating each other 16-bit string, it would generally be appropriate to regard the process as yielding one bit of entropy, even though one could filter the output to yield more (e.g. generating bit strings until one gets one that isn't all zeroes would yield 15.999978 bits of entropy while requiring, on average, only two attempts).
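As a quick numerical check of that example, a short sketch reproducing its figures (the probabilities are exactly those stated above):

```python
from math import log2

# The example's distribution over 16-bit strings: the all-zero string
# has probability 1/2; each of the other 65535 strings, 1/131070.
p0, p_other = 1 / 2, 1 / 131070

min_entropy = -log2(p0)  # entropy of the highest-probability string: 1 bit
shannon = -p0 * log2(p0) - 65535 * p_other * log2(p_other)  # ~9 bits average
filtered = log2(65535)   # after rejecting all-zero outputs: ~15.999978 bits

print(f"min-entropy:     {min_entropy:.1f} bits")
print(f"Shannon entropy: {shannon:.4f} bits")
print(f"after filtering: {filtered:.6f} bits")
```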
– supercat (answered 13 hours ago)
OK, I've seen the negative-log calculation before. So if a random process has a uniform distribution, we can say its entropy is high, correct? What I'm trying to understand is the relationship between entropy and predictability, because from what I'm reading, high entropy doesn't automatically make something unpredictable. So you can have a PRNG with high entropy that still wouldn't be suited for cryptographic purposes. – Bastien, 13 hours ago
@EllaRose Yeah, so "if a random process has a uniform distribution, we can say its entropy is high" is vacuously true in that case. You would have to show a random process with low entropy. The lowest would be a random bit, but in general if you randomly sample from a uniform distribution, the entropy will be maximized for that domain. Hashing an incrementing counter would not count as a random sample, since it gives the same results each run. – PyRulez, 12 hours ago
Ah, I see where I went wrong: using a deterministic process to try and prove a point about a random one. (Apologies to supercat for blowing up their notifications.) – Ella Rose♦, 12 hours ago
@fgrieu: The entropy of a bit string in a particular context is measured relative to the probability that the process that produced it could have produced alternative bitstrings instead, and is meaningful only in contexts where that can be measured or estimated. – supercat, 12 hours ago
@supercat: you are right about that possible definition of the entropy of a particular bitstring; I should have read more carefully. However, I think the question uses entropy for that of the generator, not of a bitstring. – fgrieu, 11 hours ago
A PRNG has a pseudo-uniform distribution, so to speak: there are in fact correlations between its outputs. So its entropy is limited to that of the seed.
Having (really) low entropy makes something predictable, since you can just brute-force the seed. The converse is not true, however. A poorly designed PRNG will leak entropy in a way an adversary can take advantage of (unless the PRNG is not meant to resist adversaries, as in a video game). When a good PRNG leaks entropy, on the other hand, it's computationally infeasible to take advantage of, so for practical purposes its entropy effectively never decreases.
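As an illustration of the brute-force point, here is a sketch against a hypothetically under-seeded generator (the 16-bit seed size and the use of Python's Mersenne Twister are assumptions of the example):

```python
import random

# Hypothetical victim: a PRNG seeded with only 16 bits of entropy.
SECRET_SEED = 0x2A51                                  # unknown to the attacker
leaked = random.Random(SECRET_SEED).getrandbits(64)   # observed output

# With so little seed entropy, exhaustive search is instantaneous.
for guess in range(2**16):
    if random.Random(guess).getrandbits(64) == leaked:
        print(f"recovered seed: {guess:#06x}")        # prints 0x2a51
        break
```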
– PyRulez (answered 12 hours ago)
"...a given PRNG has a uniform distribution, then the entropy would be high."
In cryptography, it is common to take the entropy of a generator to be that of the unknown input to the generator; for an RNG that is the seed or key. So for a common AES-128 counter-based CSPRNG, the entropy would be 128 bits. The output distribution is irrelevant where cryptographic entropy is concerned, although most RNGs in their original forms do produce uniform output; they would be difficult to use for encryption otherwise.
"So the Mersenne Twister (MT) has high entropy."
Ish. It starts out high, with an entropy of 19,937 bits for the common 32-bit implementation. However, MT is invertible, and its state can be discovered from sufficient output: observing 624 output words allows the internal state to be recovered. Consequently the entropy decreases by 32 bits per output word, reaching zero after 624 outputs; any subsequent output is entirely predictable. The leftover hash lemma applies to MT outputs of fewer than 624 words, where MT would be acting as a very inefficient randomness extractor.
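As a sketch of that state recovery against CPython's `random` module (which implements MT19937): untemper 624 consecutive 32-bit outputs to rebuild the state, then predict everything that follows. The constants below are MT's standard tempering parameters.

```python
import random

def untemper(word):
    # Invert MT19937's output tempering to recover one raw state word.
    y = word
    y ^= y >> 18
    y ^= (y << 15) & 0xEFC60000
    x = y
    for _ in range(5):                  # fixed-point inversion of the << 7 step
        x = y ^ ((x << 7) & 0x9D2C5680)
    y = x
    x = y
    for _ in range(3):                  # fixed-point inversion of the >> 11 step
        x = y ^ (x >> 11)
    return x & 0xFFFFFFFF

victim = random.Random(1234)
observed = [victim.getrandbits(32) for _ in range(624)]   # 624 output words

# Rebuild the 624-word internal state and load it into a fresh generator.
state = tuple(untemper(w) for w in observed) + (624,)
clone = random.Random()
clone.setstate((3, state, None))

# The clone now predicts every future output of the victim exactly.
assert all(clone.getrandbits(32) == victim.getrandbits(32) for _ in range(1000))
print("state recovered: all subsequent MT output is predictable")
```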
"What's the relationship between entropy and predictability?"
In cryptography generally, entropy = unpredictability = surprisal. That is what seeding RNGs and creating keys require: a cryptographer strives for unpredictable RNG sequences and unknown keys. A CSPRNG cannot be inverted, so the seed/key cannot be recovered, and thus the entropy, unpredictability and surprisal of its output are preserved no matter the length of that output.
– Paul Uszak (answered 6 hours ago)