There are a few crypto APIs that take an io.Reader as a source of random bytes, but that don't commit to how those bytes are used. This caused issues over and over, for example any time we wanted to change the algorithm. These days we both document that they are not deterministic, and use randutil.MaybeReadByte to somewhat enforce it. See #58637.

Now that we have GODEBUGs, it might be time to rip the band-aid off. I propose we start ignoring the random io.Reader parameter of the following APIs, and always use the system random source (crypto/internal/sysrand.Read, not crypto/rand.Reader which may be overridden by the application).

  • rsa.GenerateKey and rsa.GenerateMultiPrimeKey
  • rsa.EncryptPKCS1v15
  • ecdsa.GenerateKey
  • ecdsa.SignASN1, ecdsa.Sign, and ecdsa.PrivateKey.Sign
  • ecdh.Curve.GenerateKey

Using GODEBUG=cryptocustomrand=1 restores the old behavior. (Suggestions for a better name welcome.) This is a GODEBUG that I would like to remove in a few releases.

rsa.SignPKCS1v15 is not randomized, while rsa.SignPSS and rsa.EncryptOAEP have a fairly well-specified way to use random bytes. Aside from those and ed25519.GenerateKey (see below), I think I listed all APIs in non-deprecated packages that take a random io.Reader.

This might be an issue for the crypto/tls tests, which defuse MaybeReadByte by producing a stream of identical bytes. That's an abuse of GenerateKey anyway, because there is no guarantee that algorithms that expect random inputs will work with constant repeating streams. See for example #70643.

ed25519.GenerateKey is a weird exception in that it is well defined, but also documented to use crypto/rand.Reader if nil is passed. This is annoying because it forces a dependency on crypto/rand and therefore on math/big. We can't just use crypto/internal/sysrand.Read because the user might have overridden crypto/rand.Reader. I am tempted to also propose replacing "crypto/rand.Reader" with "the system random source" but it's probably not worth the risk.

/cc @golang/security

Comment From: seankhliao

It seems quite rare to override rand.Reader outside of tests: https://github.com/search?q=language%3Ago+%2Frand.Reader+%3D+%2F+-path%3A*_test.go&type=code

Comment From: mateusz834

Are there any real-world use cases where it is currently necessary to provide a random source other than crypto/rand.Reader? seccomp?

Comment From: FiloSottile

This might be an issue for the crypto/tls tests, which defuse MaybeReadByte by producing a stream of identical bytes. That's an abuse of GenerateKey anyway, because there is no guarantee that algorithms that expect random inputs will work with constant repeating streams. See for example #70643.

This is actually a pretty common requirement in tests, and it feels wrong to leave tests forever stuck with a combination of GODEBUG=cryptocustomrand=1 and defused MaybeReadByte, just to end up with breakage any time we change the algorithm.

We discussed a solution with @rsc: let's just acknowledge that tests need this, and that in tests it might make sense to take the tradeoff of losing backwards compatibility, and provide an explicit way to get to the same result, which only works in tests and which is also useful for other things.

package testing

// RandReader returns a Reader that produces a deterministic pseudo-random
// stream based on the seed.
//
// The returned reader can also be used as the rand parameter to
// rsa.GenerateKey, rsa.EncryptPKCS1v15, ecdsa.GenerateKey,
// ecdsa.SignASN1, ecdsa.Sign, and ecdsa.PrivateKey.Sign,
// and ecdh.Curve.GenerateKey to cause them to behave deterministically.
// Note that the output of these functions may and will change across Go versions,
// so any tests using this affordance must be prepared to
// update vectors based on the Go version (e.g. with a build tag).
//
// It can only be called in a test (as reported by [Testing]),
// otherwise it panics.
func RandReader(seed int64) io.Reader

Comment From: aclements

Overall this seems reasonable, though it will be a difficult transition.

We've been trying to keep niche things out of the testing package proper. How about testing/cryptotest? Then we can also call it InsecureRandReader to really hammer home that you shouldn't use this.

Comment From: FiloSottile

I proposed it as a generically named thing because "a reproducible sequence of pseudo-random bytes, for testing" is something I've needed quite a few times, not only when testing crypto, and I jury-rigged it with AES-CTR or ChaCha8.

I wonder if that's my selection bias, but I've heard from others that they would have needed this for generic purposes.

With the testing.Testing check and such a small seed, I am not really worried about folks using it thinking it's secure.

Ultimately, no strong opinion, although a new package feels like a big lift.

Comment From: aclements

This proposal has been added to the active column of the proposals project and will now be reviewed at the weekly proposal review meetings. — aclements for the proposal review group

Comment From: aclements

There are a lot of other APIs that take random io.Readers, too, like EncryptOAEP, EncryptPKCS1v15, etc. Though I also see there are a lot of functions that take a random io.Reader but are documented to ignore it. What's your ideal final state here?

It seems really odd that these APIs are going to take an io.Reader that they nearly always ignore. Another option is that we introduce new APIs that don't take the io.Reader and we deprecate the APIs that do. This would let us keep the old APIs working the way they do today, at least for another release or two, but give a strong signal that people need to move off the old APIs. I'd be much more comfortable with further changing the behavior of these APIs if people had a strong signal that this was coming. Though this wouldn't help with the testing issue.

This may be retreading old ground, but another option would be a vet check that requires the reader to be crypto/rand.Reader except in a test.

Can you give more context, or point to a past discussion, on why we decided it was okay to break backwards compatibility for these random sources?

Comment From: neild

I proposed it as a generically named thing because "a reproducible sequence of pseudo-random bytes, for testing" is something I've needed quite a few times, not only when testing crypto, and I jury-rigged it with AES-CTR or ChaCha8.

If testing.RandReader is worth adding on its own merits, I think that should be a separate proposal. (My main question there would be: Have we made it too difficult to acquire non-secure bytes out of math/rand/v2? Could this be rand/v2.Rand.InsecureReader?)

Comment From: aclements

Can you give more context, or point to a past discussion, on why we decided it was okay to break backwards compatibility for these random sources?

To clarify my question, I mean that, if I understand correctly, originally we were using the bytes from these readers in a semi-deterministic way: if you always provided the same bytes, you'd get the same results, unless you upgraded Go and there was an algorithm implementation change. At some point we moved to using those bytes but randomly shifting the stream in MaybeReadByte. I'm curious where we had the discussion in which we decided that was okay and not a violation of backwards compatibility.

Comment From: seankhliao

I believe that was #21915 / CL 64451

Comment From: rolandshoemaker

I think generally deprecation should be reserved for times when people should really stop using something, whereas in this case none of the underlying code would change; we'd just be dropping an argument. In an ideal world where we completely redesigned these APIs I think we'd drop the argument, but I don't think there is a reasonable way to do that now (also, losing some good names, like GenerateKey, would be an overall UX loss).

One possible additional option is to steal @neild's concept of bubbles from the synctest package, and provide a way within tests to set the global crypto/rand to a deterministic reader. In this option we'd document that these functions always use the global source of randomness (unless you set the relevant GODEBUG). We'd then provide a function (perhaps in cryptotest?) which takes a testing.T and sets the global source of crypto randomness to a deterministic reader, say func Test(t *testing.T, seed int64, f func(*testing.T)), for the current goroutine and all of its children.

All methods which depend on the global source of randomness would then become ~deterministic, allowing for testing without the need to construct a special reader.

The major win here would be (a) the simplicity (from the user perspective, the implementation may be somewhat complex) and (b) that we could explicitly document that the rand argument to various functions is always ignored, and don't need to worry about the semantics of when we use a special value that is passed in or not (and if we need to add checks for people passing in non-test readers etc).

Comment From: aclements

Expanding the "bubble" concept just for this feels like a heavy lift. There may be something there (bubble-local storage? 🤔), but I think it'd need to be much better motivated.

A simpler alternative would be a function like testing.WithCryptoReader(f func()) that simply sets a global deterministic random source for the duration of f, without attempting any sort of bubble-like isolation. There are hazards to that, like it wouldn't work in parallel tests, but there's precedent for such hazards, like T.Chdir.

Comment From: aclements

@rolandshoemaker points out that if we do WithCryptoReader and include the testing.T, it could fail in a parallel test. It could also fail if the global rand is already set to a testing rand. @rolandshoemaker will post a concrete API.

Comment From: rolandshoemaker

// WithCryptoReader runs function f with the global cryptographic random source
// replaced by a deterministic random source, seeded with seed. This can be used
// to test cryptographic APIs when determinism is needed.
//
// Because WithCryptoReader affects the whole process, it cannot be used in parallel
// tests or tests with parallel ancestors.
func WithCryptoReader(t *testing.T, seed uint64, f func())

Comment From: qmuntal

@rolandshoemaker what about renaming your WithCryptoReader function to WithReader and placing it in the testing/cryptotest package? This function looks too niche to be in testing.

Comment From: rolandshoemaker

Ah yeah, I should've noted that in the comment, I think we'd want this in cryptotest. I think WithRandReader might be a better name, or WithDeterministicGlobalRand to be extremely clear about what it does.

Comment From: aclements

It sounds like we're converging on:

Always ignore the rand io.Reader argument to crypto functions (with a GODEBUG to undo that), and add cryptotest.WithReader(t *testing.T, seed uint64, f func()) to temporarily set a global deterministic random reader for testing. WithReader would have safeguards (e.g., panicking if !testing.Testing()) and documented caveats that changes to crypto implementations can change how they use randomness and thus change their results.


There are also several things that use crypto/rand.Reader, which is a global variable that can in principle be rebound to some other reader. Is the proposal to only change functions that currently take an io.Reader to use sysrand, or to also replace all uses of crypto/rand.Reader and crypto/rand.Read?

Comment From: FiloSottile

There are a lot of other APIs that take random io.Readers, too, like EncryptOAEP, EncryptPKCS1v15, etc. Though I also see there are a lot of functions that take a random io.Reader but are documented to ignore it. What's your ideal final state here?

That's a good question. I'd like us to get to a place where crypto functions are either a well-specified deterministic function of their inputs (which might include a random io.Reader), or unpredictably random.

The former includes EncryptOAEP, where it's obvious how to use the random source: read hash.Size() worth of bytes, and use them as specified.

The latter are the ones covered by this proposal and that currently use MaybeReadByte to try and discourage depending on the unspecified implementation internals. Unfortunately, what that leads to is testing with repeating byte readers, which are very much not random, and so sometimes behave weirdly (like getting into a loop).


The alternative to this proposal is to come up with our own specifications for how to use random bytes in all these functions, and commit to it forever. For RSA, ECDSA, and ECDH key generation I had gotten started on a C2SP specification for deterministic key generation. For ECDSA signing we do currently use an IETF draft. For rsa.EncryptPKCS1v15... we could make something up, no one cares about rsa.EncryptPKCS1v15 anyway.

The problem with this path is that we have to be careful to make/pick a specification that's FIPS-friendly, and can't change our mind later.

It's a bit unfortunate that if we go ahead with this proposal, we can't easily switch to specified behavior later, because in the meantime we promised to ignore the random reader, so applications might have set it to nil or something insecure.


Always ignore the rand io.Reader argument to crypto functions (with a GODEBUG to undo that)

Specifically to the crypto functions listed in the initial issue

  • rsa.GenerateKey and rsa.GenerateMultiPrimeKey
  • rsa.EncryptPKCS1v15
  • ecdsa.GenerateKey
  • ecdsa.SignASN1, ecdsa.Sign, and ecdsa.PrivateKey.Sign
  • ecdh.Curve.GenerateKey

which are the ones without a clear specification.

cryptotest.WithReader(t *testing.T, seed uint64, f func())

cryptotest.WithRandom feels like a better name, since it changes the system random source, whether a reader is involved or not.

There are also several things that use crypto/rand.Reader, which is a global variable that can in principle be rebound to some other reader. Is the proposal to only change functions that currently take an io.Reader to use sysrand, or to also replace all uses of crypto/rand.Reader and crypto/rand.Read?

If the question is whether cryptotest.WithRandom also affects crypto/rand.Read and the default crypto/rand.Reader, I would say yes. Under cryptotest.WithRandom, everything that would have come from the system CSPRNG comes from this instead.

If the question is whether beyond changing functions that take an explicit Reader we also change functions that default to crypto/rand.Reader which might have been overridden... AFAICT, that's only crypto/tls.Config.Rand and ed25519.GenerateKey.

This is what I said about the latter

ed25519.GenerateKey is a weird exception in that it is well defined, but also documented to use crypto/rand.Reader if nil is passed. This is annoying because it forces a dependency on crypto/rand and therefore on math/big. We can't just use crypto/internal/sysrand.Read because the user might have overridden crypto/rand.Reader. I am tempted to also propose replacing "crypto/rand.Reader" with "the system random source" but it's probably not worth the risk.

As for crypto/tls.Config.Rand, we should probably deprecate it outright in favor of cryptotest.WithRandom, since setting only Config.Rand will only partially derandomize the connection after this, while cryptotest.WithRandom will derandomize everything, both the direct reads from crypto/rand.Reader for nonces etc. and the indirect uses of sysrand.

Comment From: aclements

If the question is whether beyond changing functions that take an explicit Reader we also change functions that default to crypto/rand.Reader which might have been overridden... AFAICT, that's only crypto/tls.Config.Rand and ed25519.GenerateKey.

That was my question, yes. But looking again, I must have messed up my search before. I thought I had found several places that used crypto/rand.Read or crypto/rand.Reader, but now the only places I'm finding outside of tests are the two places you mentioned. That's great because it makes my concern moot. :)

Comment From: aclements

The callback version makes it easy to think this is more tightly scoped than it really is. Let's instead make it more obvious that this changes global state, much like T.Chdir and T.Setenv:

package cryptotest

// SetGlobalRandom sets a global, deterministic cryptographic randomness source
// for the duration of test t.
//
// SetGlobalRandom may be called multiple times in the same test to reset the
// random stream or change the seed.
//
// Because SetGlobalRandom affects the whole process,
// it cannot be used in parallel tests or tests with parallel ancestors.
//
// Note that the way cryptographic algorithms use randomness is
// generally not specified and may change over time. Thus, if a test
// expects a specific output from a cryptographic function, it may fail
// in the future even if it uses SetGlobalRandom.
func SetGlobalRandom(t *testing.T, seed uint64)

Comment From: aclements

Have all remaining concerns about this proposal been addressed?

The proposal is to start ignoring the random io.Reader parameter of most crypto APIs, and always use the system random source (crypto/internal/sysrand.Read). Specifically, the following APIs would be covered:

  • rsa.GenerateKey and rsa.GenerateMultiPrimeKey
  • rsa.EncryptPKCS1v15
  • ecdsa.GenerateKey
  • ecdsa.SignASN1, ecdsa.Sign, and ecdsa.PrivateKey.Sign
  • ecdh.Curve.GenerateKey

A new GODEBUG=cryptocustomrand=1 would restore the old behavior.

For testing with a deterministic random source, we would add a testing/cryptotest package with the following API:

```go
package cryptotest

// SetGlobalRandom sets a global, deterministic cryptographic randomness source
// for the duration of test t.
//
// SetGlobalRandom may be called multiple times in the same test to reset the
// random stream or change the seed.
//
// Because SetGlobalRandom affects the whole process,
// it cannot be used in parallel tests or tests with parallel ancestors.
//
// Note that the way cryptographic algorithms use randomness is
// generally not specified and may change over time. Thus, if a test
// expects a specific output from a cryptographic function, it may fail
// in the future even if it uses SetGlobalRandom.
func SetGlobalRandom(t *testing.T, seed uint64)
```

Comment From: marten-seemann

If I understand this proposal correctly, it would break an important use case that I deployed at a previous job. It's a bit involved, and I'll try to explain it as concisely as possible.

WebTransport is a new transport protocol currently being standardized by the IETF, with draft versions already shipped in Chrome and Firefox. One interesting feature of the W3C API is serverCertificateHashes (w3c documentation, recently concluded discussion), which allows a client (browser) to connect to a server that uses a self-signed TLS certificate, as long as the hash of the certificate is known in advance and the validity of the certificate is less than 14 days.

Here's how one could deploy this: The WebTransport server could publish the hash of its current certificate (and potentially the hash of the certificate for the next 14-day period) out-of-band, and the client would retrieve this hash before dialing the WebTransport connection. It's a property of many address resolution systems that they exhibit a propagation delay: for example, if one were to put the certificate hash into a DNS TXT record, it might take a few hours until the new value is consistently returned by all DNS resolvers. In my deployment scenario, the value is stored in a DHT, but the same principle applies: multiple layers of caching slow down the propagation of updates.

In order to recover from crashes / reboots, a WebTransport server therefore wishes to deterministically generate its certificates. This is done by deriving the key from a (single) long-term secret stored on disk, and then deriving a deterministically random byte stream using an HKDF expansion. This is then the input to ecdsa.GenerateKey (code).


Another use case may be cryptocurrency wallets. I can't claim any expertise on this topic, but my understanding is that HD wallets use a tree structure where keys (for different chains, and for different sub-wallets on the same chain) are generated from the 24-word seed phrase. It might be helpful if someone more familiar with this topic could comment on this.

Comment From: FiloSottile

Deterministic key generation absolutely has a number of legitimate uses. However, it's already (as in, before this proposal) not supported by the standard library: the ecdsa.GenerateKey docs say "the returned key does not depend deterministically on the bytes read from rand, and may change between calls and/or between versions" and the first thing it does is call MaybeReadByte. I am actually very confused how that code works post-Go 1.19.

I made filippo.io/keygen to provide deterministic ECDSA key generation for the use cases that need it (and it includes an ECDSALegacy mechanism that matches Go 1.19 for those already stuck on it).

The ECDSA keygen case is actually a great example of the motivation for this proposal: the old method to generate a key turned out to be way, way more annoying to implement without math/big, so when we moved crypto/ecdsa to a safer, constant-time backend we needed the ability to change the process.

Comment From: marten-seemann

I am actually very confused how that code works post-Go 1.19.

I have the feeling that you'll tell me "you really should not do this", but it's possible to work around this: https://github.com/libp2p/go-libp2p/blob/v0.42.0/p2p/transport/webtransport/crypto.go#L139-L164

I made filippo.io/keygen to provide deterministic ECDSA key generation for the use cases that need it (and it includes an ECDSALegacy mechanism that matches Go 1.19 for those already stuck on it).

That's totally fair, and I think your proposal would be an easier sell if there was a well-maintained, well-documented way to deterministically generate keys. Is it your intention to add other key types to that repository as well? Will algorithmic improvements (e.g. fixes for side-channels) be backported there from the standard library?

Comment From: FiloSottile

I have the feeling that you'll tell me "you really should not do this", but it's possible to work around this: https://github.com/libp2p/go-libp2p/blob/v0.42.0/p2p/transport/webtransport/crypto.go#L139-L164

Aaaaaaa. You really should not do this :)

That's totally fair, and I think your proposal would be an easier sell if there was a well-maintained, well-documented way to deterministically generate keys. Is it your intention to add other key types to that repository as well? Will algorithmic improvements (e.g. fixes for side-channels) be backported there from the standard library?

Yeah that's the plan. I should probably tag v1.0.0 before we land this. (Fixes can be ported to the extent they don't require changing the algorithm, which is why ECDSALegacy exists and still uses math/big.)

Comment From: MarcoPolo

I made filippo.io/keygen to provide deterministic ECDSA key generation for the use cases that need it (and it includes an ECDSALegacy mechanism that matches Go 1.19 for those already stuck on it).

I think this satisfies our use case in go-libp2p, especially if you plan on releasing v1.0.0 so we can avoid churn in the output of the ECDSA function.

Comment From: FiloSottile

so we can avoid churn in the output of the ECDSA function

Neither keygen.ECDSA nor keygen.ECDSALegacy match or will match Go 1.20+ w/o MaybeReadByte.

Sorry, but you went far out of your way to defeat a safety meant to stop you from creating this problem for yourself. You have succeeded and you now have the problem.

Comment From: MarcoPolo

Understood. My point was to avoid churn in the output of keygen.ECDSA between pre-v1.0.0 and v1.0.0.

The change from ecdsa.GenerateKey to keygen.ECDSA is unavoidable, as you say, but that's okay. I'm very happy to get rid of that hack at the cost of some failed WebTransport connections for a short period of time (however long it takes for clients and the DHT to update their state, likely less than an hour).

I make no defense for that hack. It was bad and I should have looked for the keygen package sooner.

Comment From: MarcoPolo

I realize I also need deterministic ECDSA signatures. Our use case relies on creating self-signed X.509 certificates deterministically. That involves ECDSA key generation as discussed above, but also an ECDSA signature. We do this today with that ugly hack. With this proposal, which I favor, I think the only pragmatic alternative is to store more state on disk. But why can't the ECDSA signature be deterministic? I thought it was well defined by RFC 6979.

Comment From: marten-seemann

Sorry, but you went far out of your way to defeat a safety meant to stop you from creating this problem for yourself. You have succeeded and you now have the problem.

I'd like to push back on this framing a bit. What we implemented for WebTransport is a cryptographically sound (as far as I can tell) solution to an actual problem we had to solve. This code was written in Nov 2022, ~4 months before the first commit to filippo.io/keygen. We did acknowledge in a comment that this was a hack, but there simply weren't any good options available: maintaining a fork of ECDSA would have been suboptimal as well...

I appreciate the efforts to make the API more secure and less error-prone for the common case. On the other hand, there are valid use cases for deterministic key and signatures, and if done carefully, this can be used in a secure way. It would make Go a less useful language if there wasn't a way to support these use cases (without forking low-level crypto primitives).

My preference would be to have a deterministic version of these functions in the standard library (GenerateDeterministicKey or GenerateKeyWithReader?). Having them in a separate repository like filippo.io/keygen is a workable solution as well, as long as there's a long-term maintenance commitment. As a third option, has any thought been given to moving the deterministic variant of the functions to somewhere in golang.org/x?

Comment From: FiloSottile

Understood. My point was to avoid churn in the output of keygen.ECDSA between pre-v1.0.0 and v1.0.0.

The changes from ecdsa.GenerateKey to keygen.ECDSA is unavoidable as you say, but that's okay.

Oh, I misunderstood what you were asking, sorry. Waiting for v1.0.0 to avoid churning keygen.ECDSA output is perfectly reasonable. I haven't tagged v1.0.0 because I want to first write up a spec for the process (C2SP/C2SP#114).

I realize I also need deterministic ECDSA signatures.

I have good news for you as of Go 1.24! :)

I'd like to push back on this framing a bit. What we implemented for WebTransport is a cryptographically sound (as far as I can tell) solution to an actual problem we had to solve. [...]

I don't want to take this discussion too far off topic, in particular since this branch is based on my misunderstanding above.

I agree deterministic key generation has its uses. As of Go 1.24, there is no way to do it in the standard library (because your hack is cryptographically sound but not stable: we could still change the MaybeReadByte size, or the keygen algorithm), so this proposal doesn't change that. We can always consider a future proposal to add it, potentially based on keygen.ECDSA if that turns out to be a good idea and a good spec.

Comment From: FiloSottile

func SetGlobalRandom(t *testing.T, seed uint64)

Should this be a testing.TB? I can imagine wanting to make the randomness of benchmarks deterministic.

Comment From: neild

I don’t think it makes sense to replace the global random source in a benchmark. Benchmarks should generally exercise as close to the production behavior of code as possible, which means using the system random source rather than a test-specific one.

If a benchmark’s run time varies based on the randomness produced by crypto/rand, that sounds like a timing oracle and a bit of a problem.

Comment From: FiloSottile

If a benchmark’s run time varies based on the randomness produced by crypto/rand, that sounds like a timing oracle and a bit of a problem.

Not necessarily if the rejected random bytes don't get incorporated in the output. For an extreme case, see RSA key generation.

Comment From: aclements

Is deterministic key generation a more general problem in benchmarking or just this one benchmark? Are there other examples? It seems like it's just this one benchmark, in which case I'm inclined to agree with @neild 's logic.

Comment From: aclements

Have all remaining concerns about this proposal been addressed?

The proposal is to start ignoring the random io.Reader parameter of most crypto APIs, and always use the system random source (crypto/internal/sysrand.Read). Specifically, the following APIs would be covered:

  • rsa.GenerateKey and rsa.GenerateMultiPrimeKey
  • rsa.EncryptPKCS1v15
  • ecdsa.GenerateKey
  • ecdsa.SignASN1, ecdsa.Sign, and ecdsa.PrivateKey.Sign
  • ecdh.Curve.GenerateKey

A new GODEBUG=cryptocustomrand=1 would restore the old behavior.

For testing with a deterministic random source, we would add a testing/cryptotest package with the following API:

```go
package cryptotest

// SetGlobalRandom sets a global, deterministic cryptographic randomness source
// for the duration of test t.
//
// SetGlobalRandom may be called multiple times in the same test to reset the
// random stream or change the seed.
//
// Because SetGlobalRandom affects the whole process,
// it cannot be used in parallel tests or tests with parallel ancestors.
//
// Note that the way cryptographic algorithms use randomness is
// generally not specified and may change over time. Thus, if a test
// expects a specific output from a cryptographic function, it may fail
// in the future even if it uses SetGlobalRandom.
func SetGlobalRandom(t *testing.T, seed uint64)
```

Comment From: aclements

Based on the discussion above, this proposal seems like a likely accept. — aclements for the proposal review group

The proposal is to start ignoring the random io.Reader parameter of most crypto APIs, and always use the system random source (crypto/internal/sysrand.Read). Specifically, the following APIs would be covered:

  • rsa.GenerateKey and rsa.GenerateMultiPrimeKey
  • rsa.EncryptPKCS1v15
  • ecdsa.GenerateKey
  • ecdsa.SignASN1, ecdsa.Sign, and ecdsa.PrivateKey.Sign
  • ecdh.Curve.GenerateKey

A new GODEBUG=cryptocustomrand=1 would restore the old behavior.

For testing with a deterministic random source, we would add a testing/cryptotest package with the following API:

```go
package cryptotest

// SetGlobalRandom sets a global, deterministic cryptographic randomness source
// for the duration of test t.
//
// SetGlobalRandom may be called multiple times in the same test to reset the
// random stream or change the seed.
//
// Because SetGlobalRandom affects the whole process,
// it cannot be used in parallel tests or tests with parallel ancestors.
//
// Note that the way cryptographic algorithms use randomness is
// generally not specified and may change over time. Thus, if a test
// expects a specific output from a cryptographic function, it may fail
// in the future even if it uses SetGlobalRandom.
func SetGlobalRandom(t *testing.T, seed uint64)
```

Comment From: aclements

No change in consensus, so accepted. 🎉 This issue now tracks the work of implementing the proposal. — aclements for the proposal review group

The proposal is to start ignoring the random io.Reader parameter of most crypto APIs, and always use the system random source (crypto/internal/sysrand.Read). Specifically, the following APIs would be covered:

  • rsa.GenerateKey and rsa.GenerateMultiPrimeKey
  • rsa.EncryptPKCS1v15
  • ecdsa.GenerateKey
  • ecdsa.SignASN1, ecdsa.Sign, and ecdsa.PrivateKey.Sign
  • ecdh.Curve.GenerateKey

A new GODEBUG=cryptocustomrand=1 would restore the old behavior.

For testing with a deterministic random source, we would add a testing/cryptotest package with the following API:

```go package cryptotest

// SetGlobalRandom sets a global, deterministic cryptographic randomness source // for the duration of test t. // // SetGlobalRandom may be called multiple times in the same test to reset the // random stream or change the seed. // // Because SetGlobalRandom affects the whole process, // it cannot be used in parallel tests or tests with parallel ancestors. // // Note that the way cryptographic algorithms use randomness is // generally not specified and may change over time. Thus, if a test // expects a specific output from a cryptographic function, it may fail // in the future even if it uses SetGlobalRandom. func SetGlobalRandom(t *testing.T, seed uint64)```