UPDATE: 2024-07-29: the latest proposal can be found here.


Background

The current best practice for tracking tool dependencies for a module is to add a tools.go file to your module that includes import statements for the tools of interest. This has been extensively discussed in #25922 and is the recommended approach in the Modules FAQ.

This approach works, but managing the tool dependencies still feels like a missing piece in the go mod toolchain. For example, the instructions for getting a user set up with a new project using gqlgen (a codegen tool) look like this:

# Initialise a new go module
mkdir example
cd example
go mod init example

# Add gqlgen as a tool
printf '// +build tools\npackage tools\nimport _ "github.com/99designs/gqlgen"' | gofmt > tools.go
go mod tidy

# Initialise gqlgen config and generate models
go run github.com/99designs/gqlgen init

The printf line above really stands out as an arbitrary command to "add a tool" and reflects a poor developer experience when managing tools. An immediate problem, for example, is that the printf line will only work on Unix systems and not on Windows. And what happens if tools.go already exists?

So while we have some excellent tools for managing dependencies within the go.mod file using go get and go mod edit, there is no such equivalent for managing tools in the tools.go file.
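
For reference, a tools.go following the pattern recommended in the Modules FAQ looks roughly like this (the blank line after the build constraint is required for it to take effect):

// +build tools

package tools

import _ "github.com/99designs/gqlgen"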

Proposed Solution

The go.mod file uses the // indirect comment to track some dependencies. An // indirect comment indicates that no package from the required module is directly imported by any package in the main module (source).

I propose that this same mechanism be used to add tool dependencies, using a // tool comment.

Users could add a tool with something like

go get -tool github.com/99designs/gqlgen@v0.14.0

or

go mod edit -require=github.com/99designs/gqlgen -tool

A go.mod would then look something like

module example

go 1.17

require (
    github.com/99designs/gqlgen v0.14.0 // tool
)

This would allow users to subsequently run the tool with go run github.com/99designs/gqlgen.

This would mean a separate tools.go file would no longer be required as the tool dependency is tracked in the go.mod file.

Go modules would be "tool" aware. For example:

  • go mod tidy would not remove the // tool dependency, even though it is not referenced directly in the module
  • Perhaps if a module with a // tool dependency is imported by another module, Go modules understands that the // tool dependency is not required as an indirect dependency. Currently when using tools.go, go modules does not have that context and the tool is treated like any other indirect dependency
  • go get -tool [packages] would only add a dependency with a main package

Comment From: fsouza

I like this, I find it annoying to use the tools.go solution, though I'll admit I don't have a better complaint than it being annoying/weird.

If this proposal moves forward, where does the dependency go in the go.mod file? (assuming the 1.17 format with multiple require blocks). Will it have a dedicated block for tools? Or are tools treated like // indirect and placed in the same block?

Comment From: ianlancetaylor

CC @bcmills @jayconrod

Comment From: mtibben

If this proposal moves forward, where does the dependency go in the go.mod file? (assuming the 1.17 format with multiple require blocks). Will it have a dedicated block for tools? Or are tools treated like // indirect and placed in the same block?

Good question! I'm not so familiar with the reasoning behind the multiple blocks... something to do with lazy loading? I'd defer to those with more experience in this area

Comment From: mvdan

Personally, I think https://github.com/golang/go/issues/42088 is already a pretty good solution. With it, one can write go generate lines like:

//go:generate go run golang.org/x/tools/cmd/stringer@1a7ca93429 -type=Foo

Similarly, go run pkg@version can be used in scripts, makefiles, and so on. Plus, it doesn't even require a go.mod file to be used; you can use this method anywhere, just like go install pkg@version.

Another big advantage is that you can pick specific versions of tools, and they won't interfere with your main go.mod module dependency graph. Perhaps I want to use a generator that pulls in an unstable master version of a library that my project also uses, and I don't want my project to be forced into using the same newer unstable version.

Comment From: mvdan

The only downside to #42088 is that, if you repeat the same go run pkg@version commands across multiple files, it can get a bit repetitive. Luckily, you have multiple solutions at hand: sed scripts to keep the versions in sync, short script files to deduplicate the commands, or even a module-aware tool that could sync go run pkg@version strings with a go.mod file, if you wanted to do that.
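
As a concrete illustration of the "short script file" option, a minimal wrapper that pins the version in one place might look like this (tool path and version are placeholders):

#!/bin/sh
# scripts/stringer.sh - run a pinned version of the tool; the version lives only here
exec go run golang.org/x/tools/cmd/stringer@v0.1.5 "$@"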

Comment From: seankhliao

Or GOBIN=local-dir go install pkg@version: always run from the local directory, and it won't clobber whatever version the user may have globally installed. I think it would be a mistake for modules to implicitly rely on a shared mutable global bin dir for a first-class workflow.
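
A sketch of that pattern (tool path and version are placeholders):

# install a pinned tool into a project-local bin/ rather than the shared global GOBIN
GOBIN=$(pwd)/bin go install golang.org/x/tools/cmd/stringer@v0.1.5

# run the locally installed copy
./bin/stringer -type=Foo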

Comment From: mtibben

Oh interesting, thanks @mvdan I wasn't aware of that solution. πŸ€”

A few concerns immediately come to mind...

  1. You mean go run hack.me/now@v1.0.0 will just download and run some random go code 😱 That is slightly unexpected to me, equivalent to a curl | bash command. My assumption was always that go run ran local code or modules already specified in go.mod, but seems that assumption is incorrect

  2. Should gqlgen instructions always be to specify version with go run github.com/99designs/gqlgen@0.14.0? That seems verbose

  3. Repetition across multiple files, keeping version in sync, yep your comment above nails it

Comment From: mtibben

Also this go run solution should probably be added to the Go Modules FAQ if this is now considered best-practice for go:generate tools

Comment From: mvdan

In module mode, go run can always download, build, and run arbitrary code. The difference between go run pkg relying on go.mod and go run pkg@version is how you specify the version and how it's verified. With a go.mod, you are forced into a specific version recorded in go.mod and go.sum. Without one, it's up to you what version you specify; @master is obviously risky, @full-commit-hash is safest, and @v1.2.3 is a middle ground that would probably be best for most people. Even if a malicious upstream rewrites a tag to inject code, GOPROXY and GOSUMDB should protect you from that.
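
To make the trade-off concrete, the three forms look like this (the commit hash is a made-up placeholder):

go run golang.org/x/tools/cmd/stringer@master         # tracks a moving branch: risky
go run golang.org/x/tools/cmd/stringer@2d7f0a2d8c9b   # pins a commit (placeholder hash): safest
go run golang.org/x/tools/cmd/stringer@v0.1.5         # pins a tag, verified via GOPROXY/GOSUMDB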

Comment From: mvdan

Also this go run solution should probably be added to the Go Modules FAQ if this is now considered best-practice for go:generate tools

It certainly warrants a mention. I'm not sure we should bless it as the only best practice, though, because there can be legitimate reasons for versioning, downloading, and running tools some other way. Perhaps some of your tools aren't written in Go, such as protoc, so you use a "tool bundler" that's entirely separate to Go. Or perhaps you do need your tools to share the same MVS graph with your main module for proper compatibility, so you want them to share a go.mod file.

Comment From: mtibben

Gotta say though... go run pkg@version seems like a massive security footgun to me.

go install I understand well that it can download code from a remote location and build a binary. It's not obvious at all that go run directly executes code from a remote location, and I wonder how widely that is understood.

Comment From: mtibben

So even with the go run pkg@version approach, I still think this proposal has value for specifying tool dependency versions in the context of a module. This approach avoids requiring a tools.go file (as with the existing best-practice), and avoids specifying the tool version for every file that uses it (with the go run approach)

Comment From: lwc

Also worth noting: codegen tools like gqlgen and protobuf are often comprised of a generator command and a runtime, both of which typically need to be versioned in lock-step.

This proposal solves that case rather neatly, allowing go.mod to manage both generator and runtime versions.

Comment From: fsouza

Personally, I think #42088 is already a pretty good solution. With it, one can write go generate lines like:

//go:generate go run golang.org/x/tools/cmd/stringer@1a7ca93429 -type=Foo

Similarly, go run pkg@version can be used in scripts, makefiles, and so on. Plus, it doesn't even require a go.mod file to be used; you can use this method anywhere, just like go install pkg@version.

We used to do that. Then people would have that replicated across different files and the version wouldn't always match, and we wanted to automate tool updating, so we figured that migrating to tools.go + having everything in go.mod would be better for compatibility with the ecosystem built around go modules (vs rolling our own tool to keep modules used directly in //go:generate up to date).

Again, tools.go works, but it's weird (not very scientific, I know πŸ™ˆ). I think this proposal makes version management of tools better because it enables people to manage them using solely go commands (vs things like the bash oneliner shared by the OP).

Comment From: bcmills

@jayconrod has previously suggested something similar, using a new directive (perhaps tool?) instead of a // tool comment.

Personally, I prefer the approach of adding a new directive β€” today we do treat requirements with // indirect comments a bit specially in terms of syntax, but they are semantically still just comments, and I would rather keep them that way at least to the extent possible.

A new tool directive, on the other hand, would allow us to preserve the existing semantics of go mod tidy without special treatment for // tool comments.

Comment From: mvdan

@bcmills would such tool requirements be part of the same MVS module graph?

Comment From: bcmills

The tool directive would list package paths (not module requirements), and the named packages would be treated as if imported in a .go source file in the main module.

In particular:

  • go mod tidy would ensure that the packages transitively imported by the named package (and its test) can be resolved from the module graph.
  • go mod vendor would copy the packages transitively imported by the named package into the vendor directory (but would omit its test code and dependencies as usual).
  • go list direct (#40364) would report the named packages as direct imports.

Comment From: carldunham

Or go list tools

Comment From: jayconrod

I like this proposal. I've had something similar in my drafts folder for a while. @bcmills touched on the main difference. go.mod would have a tool directive that would name the full package path for the tool. You'd still need a separate require directive for the containing module, and that would be treated like a normal require directive by MVS.

module example.com/use

go 1.18

require golang.org/x/tools v0.1.6

tool golang.org/x/tools/cmd/stringer

I don't think go run tool@version and go install tool@version completely replace go run tool and go install tool. When the @version suffix is used, it ignores the go.mod file for the current module. That's useful most of the time, but not if you want to track your tool dependencies together with other dependencies, or if you want to use a patched version of a tool (applying replace directives).
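
For illustration, using a locally patched copy of a tool under this scheme might look like the following go.mod sketch (the ../tools-fork directory is hypothetical, and tool here is the proposed directive, not something the go command supports today):

module example.com/use

go 1.18

require golang.org/x/tools v0.1.6

replace golang.org/x/tools => ../tools-fork

tool golang.org/x/tools/cmd/stringer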

Comment From: mtibben

Yeah I like the tool directive. There might be a couple of tradeoffs with compatibility with older go versions. A tool directive wouldn't be recognised by older go versions, and presumably ignored. A require directive with // tool would be recognised, but would be removed by a go mod tidy.

A tool directive would keep the dependency tree separate - as they should be. For example, I don't think indirect dependencies would need to be tracked for tools, or shared by the module. Essentially a tool directive would specify a version when running go run tool instead of needing go run tool@version

Comment From: mtibben

Or have I got that wrong? Is sharing indirect dependencies between tools and other dependencies a desirable feature?

Comment From: jayconrod

A tool directive wouldn't be recognised by older go versions, and presumably ignored. A require directive with // tool would be recognised, but would be removed by a go mod tidy.

Right. The go command reports errors for unknown directives in the main module's go.mod file, but it ignores unknown directives in dependencies' go.mod files. So everyone working on a module that used this would need to upgrade to a version of Go that supports it (same as most other new features), but their users would be unaffected.

A tool directive would keep the dependency tree separate - as they should be. For example, I don't think indirect dependencies would need to be tracked for tools, or shared by the module. Essentially a tool directive would specify a version when running go run tool instead of needing go run tool@version

Or have I got that wrong? Is sharing indirect dependencies between tools and other dependencies a desirable feature?

My suggestion is to have tool act as a disembodied import declaration: it's just in go.mod instead of tools.go. You'd still need a require directive for the module providing the tool, and it would be treated as a regular requirement by go mod tidy and everything else.

If you don't want to mix tool and library dependencies in go.mod, it's probably better to either use go run tool@version or to have a separate tools.mod file, then go run -modfile=tools.mod tool.
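
A rough sketch of that separate-file workflow with today's go command (-modfile has been supported since Go 1.14; the steps and versions here are illustrative):

# seed a separate module file for tools (keeps the same module path)
cp go.mod tools.mod

# add the tool's module to tools.mod / tools.sum instead of go.mod / go.sum
go get -modfile=tools.mod golang.org/x/tools/cmd/stringer

# run the tool using the versions recorded in tools.mod
go run -modfile=tools.mod golang.org/x/tools/cmd/stringer -type=Foo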

Comment From: mtibben

Yep that makes a lot of sense @jayconrod

Comment From: rsc

This proposal has been added to the active column of the proposals project and will now be reviewed at the weekly proposal review meetings. β€” rsc for the proposal review group

Comment From: mtibben

@jayconrod Did you want to write up the tool directive approach that we could incorporate as an option into this proposal? I'm happy to collaborate on it with you. Positive feedback on that approach so far in this thread, and it would be good to compare the options directly against each other, now that this proposal will be considered by the go-powers-that-be

Comment From: jayconrod

Sure, I'll paste my draft proposal below. Unfortunately I won't be able to work on the implementation for this, but this is what I was thinking in terms of design.


tool directive

I propose adding a tool directive to the go.mod file format. Each tool directive names a package.

tool golang.org/x/tools/cmd/stringer

go mod tidy and go mod vendor would act as if each tool package is imported by a package in the main module. Tool packages would be matched by the all metapackage.

Modules providing tool packages must still be required with require directives. Requirements for tools would not be treated differently from other requirements. This means that if a command and a library are needed from the same module, they must be at the same version (related: #33926). Requirements on modules providing tools would also affect version selection in dependent modules if lazy loading is not enabled.

The tool directive itself would not affect version selection. go mod tidy, go mod vendor, and other commands would ignore tool directives outside the main module.

go get -tool

tool directives could be added or removed with go get, using the -tool flag. For example:

go get -tool golang.org/x/tools/cmd/stringer@v0.1.0

The command above would add a tool directive to go.mod if one is not already present. It would also add or update the requirement on golang.org/x/tools and any other modules implied by that update.

require golang.org/x/tools v0.1.0
tool golang.org/x/tools/cmd/stringer

A tool directive could be removed using the @none version suffix.

go get -tool golang.org/x/tools/cmd/stringer@none

-tool could be used with -u and -t.

tools metapackage

To simplify installation, go install and other commands would support a new metapackage, tools, which would match packages declared with tool dependencies.

# Install all tools in GOBIN
go install tools

# Install all tools in the bin/ directory
go build -o bin/ tools

# Update all tools to their latest versions.
go get tools

It would not be an error for a tool directive to refer to a package in the main module, so the tools metapackage could match a mix of local and external commands.
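
For example, a go.mod under this draft mixing an external tool with a command from the main module might look like this (assuming the tool directive supports the same block form as other directives):

module example.com/m

go 1.18

require golang.org/x/tools v0.1.0

tool (
    golang.org/x/tools/cmd/stringer
    example.com/m/cmd/mytool
)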

Comment From: mtibben

Modules providing tool packages must still be required with require directives. Requirements for tools would not be treated differently from other requirements. This means that if a command and a library are needed from the same module, they must be at the same version (related: #33926). Requirements on modules providing tools would also affect version selection in dependent modules if lazy loading is not enabled.

Ah yes this is similar to the // tools approach which wouldn't allow a different version between the tool binary and the library to be specified.

Specifying the actual location of the binary isn't something the // tools approach solves - but do we need to? In the // tools approach, the version is specified once as part of a normal require directive, but the location of the binary to be run is left up to go run. e.g. the go.mod would look like

require golang.org/x/tools v0.1.0 // tool

and you'd call go run which would use the version defined by go.mod

go run golang.org/x/tools/cmd/stringer

@jayconrod Is the reason for specifying the exact tool location as you've described (e.g. tool golang.org/x/tools/cmd/stringer) just to allow go install tools? I feel that "installing" tools might just cause the same kind of version issues between projects if installing to a common location, and go run may be the superior approach.

Comment From: jayconrod

@jayconrod Is the reason for specifying the exact tool location as you've described (e.g. tool golang.org/x/tools/cmd/stringer) just to allow go install tools? I feel that "installing" tools might just cause the same kind of version issues between projects if installing to a common location, and go run may be the superior approach.

I was suggesting tools would be a metapackage (like cmd or std), so you could use it with go install tools (hopefully setting GOBIN first), or go build -o bin/ tools or go list tools; it would work with any subcommand that accepts package arguments.

I think having a separate tool directive in go.mod is helpful for understanding why a requirement is needed. For example, suppose you stop using golang.org/x/tools/cmd/stringer in favor of a more advanced tool. At some point in the future, you might see:

require golang.org/x/tools v0.1.0 // tool

and wonder why it's there. It's not clear that it's safe to remove. But with:

tool golang.org/x/tools/cmd/stringer

you'd know that it's not used anymore, so it's safe to delete that line. The next go mod tidy would remove the corresponding requirements (assuming no other packages are needed).

Comment From: bwplotka

2c from my side: Go tools are used not only by Go modules. This is why a separate go.mod for tools or something like what https://github.com/bwplotka/bingo automated for you might be preferred. (:

Comment From: jayconrod

@bwplotka Separate go.mod files are already supported via the -modfile flag since Go 1.14.

Comment From: rsc

@bcmills, @matloob, what's the status on this?

Comment From: bcmills

I'm in favor of Jay's refinement to the proposal (posted in https://github.com/golang/go/issues/48429#issuecomment-938010150), using a new keyword (perhaps tool) to denote dependencies on specific packages that are not otherwise imported, for which dependencies would then be maintained by go mod tidy.

I think this would be a good ergonomic improvement, and it seems feasible to implement for Go 1.19 or so.

Comment From: andig

I agree with @bcmills. Having go track tools is easily achieved by using tools.go. It takes a learning curve, but it is a "one-time" problem on the developer side. Being able to actually install tools using the enhanced tooling suggested in https://github.com/golang/go/issues/48429#issuecomment-938010150 is something that every user of a module is confronted with.

Comment From: deefdragon

Just to confirm: tools used by required modules, or by the tools themselves, will not be imported as indirect tools, correct?

Comment From: bcmills

Correct. It would be as if they were imported by an internal (or otherwise non-importable) package.

Comment From: rsc

Putting on hold for Go 1.18 work.

Comment From: mewmew

Firstly, I also support the proposed dedicated tool directive, as this reduces the use of "magic comments" with semantic meaning, of which there are already enough in the Go ecosystem. We don't need to introduce more :)

re: go install tools by @jayconrod in https://github.com/golang/go/issues/48429#issuecomment-938010150

# Install all tools in GOBIN
go install tools

# Install all tools in the bin/ directory
go build -o bin/ tools

# Update all tools to their latest versions.
go get tools

Just a quick comment regarding tool dependencies used by Go modules. Not all external tools will be developed in Go, some external tools may be developed in arbitrary languages, but we may still wish to pin specific Git commit revisions or tagged versions to ensure that all build requirements of our Go modules are satisfied and don't go out of sync with the required versions of external tools.

A real world example of this is the Textmapper tool (written in Java) used by github.com/llir/llvm to generate lexers and parsers for LLVM IR from a BNF grammar. Since the Textmapper tool is not written in Go, it is currently tracked by a Git submodule: https://github.com/llir/ll/tree/master/tools

Just put this out here, to keep in consideration when working on dependency handling of tools (e.g. build tools) required by the Go module.

Should go.mod also pin versions/revisions of tools developed in languages other than Go? If not, simply disregard this comment.

Cheers, Robin

P.S. @inspirer is working on a Go version of Textmapper (https://github.com/inspirer/textmapper/issues/6), but it has yet to reach feature parity with the Java version. (The above still applies to other tool dependencies used by Go modules and developed in other languages than Go of course.)

Comment From: deefdragon

@rsc This should be able to be taken off hold right?

Also, I am a bit confused why this went on hold in the first place. I understand that there was a lot of 1.18 work, but why did that necessitate this going on hold again?

Comment From: ianlancetaylor

@bcmills Is there more work pending, or should this come off hold? Thanks.

Comment From: bcmills

This should come off hold.

Comment From: ConradIrwin

I like the proposal to add a tool directive to go.mod.

Instead of adding go get -tool, perhaps go install should update these lines when executed within a module (though this may have too many false-positives because I don't pay attention to my working directory when installing things, it would avoid the need for new syntax to learn – and any incidentally added tools could easily be removed).

Unlike the proposal by @jayconrod, I would not merge the tools' dependencies into the package's dependencies (and I'd build the tools ignoring the go modules require and replace directives). Each tool should be built in standalone mode, because it would be surprising for me that adding a tool could affect my main module's build.

As such I'd make the syntax inclusive of a version number:

tool golang.org/x/tools/cmd/stringer@v0.2.0

At this point running go run golang.org/x/tools/cmd/stringer or go install golang.org/x/tools/cmd/stringer in the module directory would always pick up v0.2.0.

Comment From: ConradIrwin

Thinking further...

One problem with the current approach to bundling tools with repositories is that in order to manually run the tool I need to either type go run quite/a/lot/to/type every time (which is tedious), go install it (which means that I can no longer be sure I'm running the right version as I hop from project to project), or write a wrapper script (which is unnecessary).

Maybe a better way of avoiding this is to focus on fixing the ergonomics of running tools that have been bundled with the module, and making it easy to run the correct version (c.f. #57001 for go itself).

I propose the following:

  1. go.mod gets a new directive run that lets you define a set of tools for use with the current project. The syntax is run [toolname] => [path-to-run]. The path-to-run must be in the current module (starting with ./) or in a module that is required by the current module.

    require golang.org/x/tools v0.2.0
    run stringer => golang.org/x/tools/cmd/stringer
    run boop => ./cmd/boop.go

    The reason that it is restricted to either the current module (or modules you depend on) is so that transitive dependencies appear in your go.mod and go.sum to give you reproducible builds, and it gives you the ability to replace tool dependencies if you wish. It is arguably a bit odd that the tool will use the same version of dependencies as your main module, but it is not likely a problem in practice, and it keeps the mental model simpler.

    The reason to explicitly specify the toolname is to allow for the case where there are multiple commands with the same name. If this isn't a case we want to support, it would be reasonable for the syntax to be run path/to/X where the toolname is inferred from the last path segment.

    Run lines may only contain one [path-to-run], so if you want to write a tool using multiple go files, you'll have to put them in a directory.

  2. go run X would look for a run directive with toolname X and (if it exists) would act as though you'd run go run path-to-run in module mode.

When run in this mode, go run would cache the fully linked binaries so that future runs of go run X do not need to re-link if the built tool is up-to-date (just as go build does). This should mean that running go run X is relatively quick the second time.

The binary will be cached at $GOCACHE/tool/<current-module-path>/<toolname>, so there will be at most one version of each tool cached per module. Unlike go run in general, this should have a relatively good hit rate. go clean -cache would empty this directory. (We could also add a separate go clean -runcache if we think it's likely people will want to clear this directory without clearing the rest of the cache).

  3. go get -run X@version would add a new run line, inferring the toolname from the last path segment; and (if necessary) add a require line for the module containing X at the given version. go get -run X@none would remove the line (and also the require if it's not otherwise needed).

Although I like @jayconrod's idea of the tools meta-package described above, I have removed it from this proposal because I think the use-case is not that clear. It would be somewhat nice to be able to "precompile all tools" so that go run X is fast the first time, but it's not essential (and it would be possible to write a script that did that). It could be a good thing to add later if there's demand.

Edit: this was updated to reflect a slightly tighter scope; and to rename the directive to "run" instead of "tool" to (maybe) reduce confusion (as go tool does something completely different from go run). (Though maybe go tool X should get this behaviour instead of go run X?)

Previous version:

I propose the following:

  1. go.mod gets a new directive tool that lets you define a set of tools for use with the current project.

    // stringer is a standalone tool
    tool stringer => golang.org/x/tools/cmd/stringer v0.2.0
    // boop is a tool in the current module
    tool boop => ./tools/boop
    // protoc-gen-go is provided by a required module
    tool protoc-gen-go => google.golang.org/protobuf/cmd/protoc-gen-go
    require google.golang.org/protobuf 1.28.1

    If a version number is specified in the tool line then that version of the tool is used; it is built in standalone mode (ignoring the require/replace directives of the current go.mod). If a version number is not specified, then it is built in "companion mode" (respecting the require/replace directives of the current go.mod), and it must either be in the current module or in a module that is required (copying the behavior of go run today).

  2. go run learns to pick up tools from the go.mod file: go run stringer would work exactly as if I'd run go run golang.org/x/tools/cmd/stringer@v0.2.0.

  3. go get tools would download tool dependencies, go build tools would compile them and cache the result so that go run stringer is fast. go install tools would install them globally (though this may be an anti-pattern).

  4. go get -tool golang.org/x/tools/cmd/stringer@latest in module mode would add a tool line with the latest version, and with the toolname inferred from the path of the module, in addition to doing the install.

It is arguably possible to omit the first argument to the tool directive in the go.mod and infer it from the last path segment of the second argument (tool golang.org/x/tools/cmd/stringer@v0.2.0 would be equivalent to tool stringer golang.org/x/tools/cmd/stringer@v0.2.0) but this feels a bit too magic to me...

This would not support non-go tools, as I think specifying a way to version arbitrary binaries is probably out of scope for go's tooling.

I am not sure whether tool stringer golang.org/x/tools/cmd/stringer@v0.2.0 should change the behaviour of running go run golang.org/x/tools/cmd/stringer; it definitely could pick the version up from the go.mod, but it may not be clear why it would given the chosen syntax.

This does require teaching people to use go run stringer instead of stringer or go run golang.org/x/tools/cmd/stringer@v0.2.0, but I think that's a much better tradeoff all around.

See also: https://github.com/golang/go/issues/44469, https://github.com/golang/go/issues/42088 https://github.com/golang/go/issues/33468

Comment From: gopherbot

Change https://go.dev/cl/472755 mentions this issue: cmd/go: support run directive to go.mod

Comment From: ConradIrwin

@rsc is there a way to ask the proposal committee to take a look at this when you next meet?

There are roughly two options proposed here:

  1. the one proposed by @jayconrod: https://github.com/golang/go/issues/48429#issuecomment-938010150, which replaces the existing hack with a line in go.mod
  2. the version proposed by me: https://github.com/golang/go/issues/48429#issuecomment-1415058683, which attempts to solve the problem more holistically.

Which seems like the right direction, and what are the remaining things to resolve before something like this could be accepted?

I'd be interested in trying to implement either of these approaches (or an alternate idea) for go 1.22; but I'd love some input from you all on what makes sense as a next step; and if it'd be helpful I'm happy to write a more detailed proposal doc.

Comment From: rsc

This proposal has been added to the active column of the proposals project and will now be reviewed at the weekly proposal review meetings. β€” rsc for the proposal review group

Comment From: joeblubaugh

I have some questions about @ConradIrwin 's proposal:

  • Should run directives really participate in MVS? One advantage of scripting go install X@v1.0.0 over tools.go is that the version of the installed tool doesn't affect the version of the modules selected for building the module's code.
  • go get -run X@none feels hacky - should the user just delete the run directive instead, and run go mod tidy? If not, perhaps a change to go mod? Something like go mod remove?
  • What happens if I call go get -run -u X?

I had a thought about a separate tool management system related to go install. A file go.tools that can interact with go install and go run:

install ./cmd/boop.go
// or, with an alias:
install anyToolName => ./cmd/boop.go
install golang.org/x/tools/cmd/stringer v0.2.0

When run without arguments, in a module with a go.tools file, go install will:

  • build main packages that are local to the module, like ./cmd/boop.go
  • build non-local packages in the same way go install currently does.

These compiled binaries will be cached in the same way as @ConradIrwin's proposal. When calling go run within a module with a go.tools file, go run will select the cached version of the binary specified by go.tools

I like that this idea cleanly separates tools used to interact with a module (and may do nothing with the Go source at all), and dependency versions required to build and test the module's Go code.

Comment From: leighmcculloch

I think @ConradIrwin's proposal (https://github.com/golang/go/issues/48429#issuecomment-1415058683) works for most of the Go repositories that I work in and I think I would use it, but it feels unnecessary given the go run command can run any Go tool already.

+1 @joeblubaugh's concern (https://github.com/golang/go/issues/48429#issuecomment-1465426311) about the tools affecting version selection. I think this would create some big surprises. There are times where it would be super useful, if I'm using a tool that has a corresponding library and the two have to be aligned. But there are also times where it would be surprising if my transitive dependencies controlled which version of a tool was in use.

For the most part I've found go run module/path@version to be a really effective way to run tools. I use that syntax in go:generate directives and in Makefiles and it works great.
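
For example, a Makefile that keeps the pinned version in a single variable (tool and version are placeholders) might look like:

STRINGER := golang.org/x/tools/cmd/stringer@v0.9.0

.PHONY: generate
generate:
	go run $(STRINGER) -type=Foo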

I think the main thing these proposals are adding is aliasing shorter names to paths. If paths are truly too long to be convenient maybe a general path aliasing system would be appropriate. Other systems have done this. For example, Deno added support for aliasing with their import map file (e.g. https://deno.land/manual@v1.31.0/basics/import_maps). I'm not advocating for aliasing, I don't think the go tool should adopt path aliasing as a feature, but that's what it feels like these proposals are adding to the go tool, narrowly for tools.

Comment From: bcmills

I think the main thing these proposals are adding is aliasing shorter names to paths.

I think there are two separate concerns.

One concern is adding tools to the dependency graph, particularly for go mod vendor. It isn't feasible to separate those dependencies from the general dependency graph, because the vendor tree (intentionally) doesn't allow for more than one version of the same package import path, and I don't think it's worth adding more complexity in order to support that.

The other concern is making it easier to run tools; I think that's what the aliasing is getting at. I could see that being useful for, say, tests that run those tools. But I think there is a lot of complexity there that would need to be resolved β€” for example, if I run go test example.com/m, how would m_test identify the stringer selected by the module from which go test is run (which may in general be different from the one containing example.com/m)?

Comment From: ConradIrwin

@joeblubaugh / @leighmcculloch I went back and forth on "should they contribute to MVS or not". Originally I thought maybe the module author should be able to choose, but the distinction is subtle. I landed on "yes, they should" to give module authors control over which dependencies are pulled in. You can of course still go run x@y (or go install x@y) if you need to not have them intermingle. For the projects I work on, it would make little difference because the dependencies of my tools are mostly distinct from the dependencies of the app (and as @bcmills points out, it would be a very big change to allow tool dependencies to differ from the main module dependencies).

I am not sure whether go get -run is needed, but I liked the idea from @jayconrod's proposal because it gives you a one-line command to run to make the change, which could be copy/pasted into documentation. The @none syntax is already supported by go get, so it made sense to me to support here too. go get -run -u would act just like go get -u (without the -run).

A separate go.tool file seems unnecessary make that much sense if the tools are participating in MVS as then any tool added would need changes in two files. It seems simpler to use the one file.

@joeblubaugh I think that change to go install you propose would be quite intrusive (I mostly use go install to "put the binary produced when building the current package on my path"). @jayconrod had proposed making go install tools do something like what you suggest – though putting them directly in the path – we could expand this proposal to support that (or do later) if it's a common desire.

@leighmcculloch glad to hear you would use this! I do hear your point around aliasing being unnecessary (and indeed I'd be happy to have something that did the versioning and caching without the aliasing). Adding the aliasing I think makes go run significantly more user-friendly (currently I either go install and then use the binary name, which leads to problems making sure the version matches between repos; or I write wrapper scripts to avoid having to type the full path; it'd be nice to just go run X instead).

Comment From: ConradIrwin

@bcmills interesting thought. Currently go test builds the test binary with the current module's dependencies, but runs the tests in the directory containing that module's code. I think this means that it will "do the right thing" in most cases – a test that shells out to go run will pick up the run directives from the module being tested.

It does mean that if you have a different version of the tool required by the main module and by example.com/m then the test will be compiled with one version but go run will run with a different version. (This already is a problem today if you shell out to go run example.com/tool in the tests of example.com/m using the current tools.go hack).

I think it would be theoretically possible to fix the version mismatch in the specific case of go test, but I'm not sure that the cost would be worth it in practice. (We'd either need a new dependency resolution mode, or code to generate a new go.mod that merges two previous ones, and an environment variable to tell go run to change its behaviour).

I'm sure we shouldn't try to fix this in the case that you go build an arbitrary binary and then run it in a different directory – if it shells out to go run then go run would have no specific knowledge of the module used to build the binary; it would just use the working directory. For me that's a pretty strong argument that it should work the same way for tests too.

I don't think this is a problem for other go commands (go run doesn't change directory, go generate only works on the current module), but are there other places it's likely to show up (and cause actual problems)?

Comment From: perj

Overall, this seems useful; the tools.go situation has always been an annoyance to me. I tend to put go run github.com/x/y in my go:generate comments and thus rely on go.mod for the version info. I don't really mind typing go run github.com/x/y in my shell either, but shortcuts are interesting, especially when working with people not that familiar with Go.

The proposal so far doesn't mention any go mod edit commands to edit run lines, can I assume those will be added?

Finally, a bit of devil's advocate... currently I can create a vendor/hello/ directory and use go run hello to run it. How does this proposal address that versus the shortcuts? I'm not doing that in practice, but it's possible.

Comment From: ConradIrwin

@perj Good to know that this would be useful to you! And good call on go mod edit; we should add go mod edit -run and go mod edit -droprun.

I hadn't realized about /vendor/x – if you have both (and they point to different things) the ones in go.mod should take precedence; otherwise if you have a go.mod file with run directives when you go mod vendor the behaviour of the directive could change.

Comment From: mcandre

Meanwhile I have published a basic CLI tool to pin Go dev tool versions.

https://github.com/mcandre/accio

As much as I enjoy contributing developer tools, I would prefer to be able to deprecate this workaround and just use the builtin go mod system.

(Would also love to be able to ditch modvendor, and have go mod vendor stop deleting critical cgo source files, for the same reason. But that's uh off topic for this discussion )

Comment From: ConradIrwin

@mcandre Thanks for sharing! Would the change proposed in https://github.com/golang/go/issues/48429#issuecomment-1415058683 give you the benefit you get from accio?

I notice it takes quite a different approach (installing specific dependency versions into the path, vs requiring go run toolname), but I'm hoping that this change would solve the same problem (having the tools you need to collaborate on a module at your fingertips), albeit in a different way.

Comment From: bwplotka

Thanks for the proposal @ConradIrwin and others! Happy to see this moving forward πŸ’ͺ🏽 Similar to @mcandre I would love to switch to native solution. In the meantime https://github.com/bwplotka/bingo is still up to date and got quite some traction.

My 2c of the discussion so far, assuming we iterate over @ConradIrwin proposal:

  1. Having some aliasing is useful, although the majority of those tools are used in Makefiles in the end, so we could do it in a separate iteration of the proposal.
     a. Side question: Can I track, within this proposal, multiple tools from the same Go module under different versions? (Example use case: an e2e test that runs through multiple Go binaries of different minor versions.)
  2. I like the version pinning potential and reusing go.mod semantics.
  3. If we are meant to use the normal go.mod, I am worried about dependency hell from reusing the same MVS/dependency graph as the module. I know @bcmills mentioned it's not trivial, but I think it's a must-have. IMO we don't want any Go modules importing my module to have the pain of downloading and matching all the dependencies for tools that are not needed to compile my code. Furthermore, while this is probably controversial, many tools require custom replace directives (incompatibilities still happen) and users don't want to spend hours crafting the dependencies of 10 tools to work strictly under the same deps together for no good benefit (unless I'm missing the benefits, other than downloading less overall?).

Also agree with the majority of @leighmcculloch's comment, except this one:

I think the main thing these proposals are adding is aliasing shorter names to paths.

If that's true, then we might be missing the point. To me the main problem we are trying to solve here is being able to save and track versions of tools and their dependencies (including potential replace directives) in a declarative way, for the portability of the project's development. I think it has to be a separate dependency graph, as mentioned above.

Comment From: ConradIrwin

@bwplotka I strongly agree that the main issue to solve is versioning (though I think aliases help, I'd be happy to defer to a second round too). Responding to other points:

Can I track, within this proposal, multiple tools from same Go module under different version? (Example use case: e2e test that runs through multiple Go binaries of different minor versions).

Not as proposed. Similar to how it works for go libraries, you can only have one version of a given module in your go.mod.

IMO we don't want any Go modules importing my module to have pain of downloading and match all the dependencies for the tools that are not needed to compile my code

Agreed! With this proposal if your module depends on a module that has run directives, you will not inherit the dependencies of those run directives in your go.mod (this is different to how it works if you have a tools.go in a depended-on package today).

That said, we still maintain the invariant that your go.mod can only have one version of each module; so if a tool you use depends on a module that your code also depends on, you must choose one version that works with both your code and the tool. (Similarly, if multiple tools depend on the same dependency, it must resolve to exactly one version).

There are some advantages to this: you can be sure that a tool used in go:generate has the same version as any library it includes in your code, and you can control the dependencies of your tools with require and replace directives exactly as you do for your own code. Would this be sufficient to solve the problems you've experienced?

(It's maybe worth noting that this proposal does not aim to solve the problem for every tool. You call out prometheus in the bingo blog post, but as you cannot go run github.com/prometheus/prometheus/cmd/prometheus@latest, this proposal isn't directly trying to make it work. Interestingly, you can force it to work by copy-pasting replace directives, but it's probably best to follow their installation instructions).

Comment From: mcandre

accio has never gained traction, too bad.

Unclear where bingo pins the version information. One of those tools that seems to discourage directly editing text file configurations.

I love Mage, and I use it to write the various development / build tasks for all my Go projects. But being a third party Go dependency, Mage itself requires pinning and a manual go install... command.

If Mage is the one and only dev dependency for a project, then it's reasonable to ask contributors, and CI/CD pipelines, and Docker images, etc. etc., to run just the one go install... command. But that means we're not using Go linter dependencies. That means we're not using Snyk CLI. That means we're not using a lot of goodies. So the list of dev dependencies is expected to grow, and we need something to automate that provisioning step. So Mage is not a complete solution to our immediate problem.

Do we use sh? Require all development work to be done nested inside a Docker container? Use Ansible, God forbid?

What about writing a go-requirements.txt file with the go install... commands spelled out, one per line? It could even have chmod bits for execution in UNIX terminals. Dot slash to run on UNIX, dot backslash to run on PowerShell. Give legacy Command Prompt users the finger. Naturally, comment syntax may not necessarily be portable across all the world's different shell interpreters.

Today, the cleanest, most portable solution involves writing a (POSIX) Makefile:

.PHONY: all

all:
    go install github.com/mcandre/karp/cmd/karp@v0.0.7
    # ...

And provision the dev environment by running make.

This is perhaps the least gross workaround to go mod's lack of dev tool tracking.

Yes, make represents yet another tool, growing the technology stack. But at least (GNU/BSD) make are reasonably common companions with Go projects, especially Cgo projects. So it's not an entirely new entity.

However, makefiles are even harder to write correctly than shell scripts. There are going to be people who try to put in more than go install... or cargo install... command string literals. Any single quotes, glob asterisks, dollar signs, or file paths, will immediately break things for Go developers in Command Prompt, PowerShell, fish, BSD (t)csh, Alpine Linux, toybox/busybox, etc. etc. People will try to use (GNU) findutils in their makefiles. People will try to do way too much, when all we need is a very literal list of commands. And makefile linters are few and far between.

I'm actually planning on writing such a makefile linter to encourage extreme portability, limiting the damage.

But ideally I would prefer not to need makefiles at all, and to rely solely on first-party go mod to do the right thing.

Comment From: ConradIrwin

@mcandre that all makes sense – and I strongly agree with your frustrations!

Do you think the proposed solution above would "do the right thing" in your mind, or are there deficiencies we should try and address? To summarize:

  • go.mod gains new syntax "run X => Y" (e.g. run karp => github.com/mcandre/karp/cmd/karp)
  • go run karp would run karp for you.
  • The version of karp to be used would be defined by require rules in the go.mod file.
  • go get -run github.com/mcandre/karp/cmd/karp@v0.0.7 would add the necessary require and run directives to go.mod so that go run karp works the same way for all collaborators going forward.
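
Put together, a go.mod for that example might look something like this under the proposed (not yet accepted) syntax; the module path is a placeholder:

module example.com/myproject

go 1.20

require github.com/mcandre/karp v0.0.7

run karp => github.com/mcandre/karp/cmd/karp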

This comes with a secondary benefit of not polluting your $PATH, so that different projects which rely on different versions of karp can co-exist (though unlikely to be an issue in this case, it can be for linters and tools used with go:generate).

Comment From: nightlyone

Keep in mind that the proposed solution of wrapping the invocations using go run needs a bit extra work to e.g. manage protoc plugins like gen-grpc-go as you cannot control how they are invoked.

That could be solved via a wrapper program then installed at the correct location, but that would still be a cumbersome process.

On the other hand, that process works very well for many interpreted languages, like Python with Poetry, JavaScript with npm's npx, etc.

So maybe the Go community will adapt there as well.

Comment From: mcandre

Personally, I am not a fan of invocations like "$(npm bin)" <command ...>, bundle exec <command ...> or go run <command ...>. That requires additional typing, and often messes up invocations from noninteractive contexts, such as makefiles/magefiles.

$("npm bin)" is particularly tricky for dev environments using a Command Prompt or PowerShell interpreter. No, I do love WSL but I don't want to force my users to have to depend on WSL, if I can take quick, practical steps to provide more portable solutions that don't need it, so much the better.

But bundle exec and go run aren't much better for my purposes. I want to be able to run my Go tools as completely ordinary applications.

No, I don't want to write an alias or function for every single Go tool to automatically expand into go run <command ...>. That would be laborious. Considering that a POSIX sh family interpreter is not always the Go user's default interpreter, setting up those abbreviations becomes a polyglot nightmare.

I would prefer instead that go mod place any binaries into a fixed directory like ./.go-mod/bin or similar predictable, per-project directory structure. Then it becomes easier for users to add that directory to a PATH environment variable and run the tools as <command> ... without a prefix. No, I don't know of a perfect solution that would also allow for users to invoke Go dev dependency tools from project subdirectory CWD's. Other than overwriting Go binaries at the per-user level. No, I have not tried Go workspaces yet.

If I had to choose, I'd actually rather go mod overwrite binaries in ${GOPATH}/bin at the per-user level, as go install often does, and be left dealing with the simpler problem of per-project conflicts between versions and CLI tool names. I would prefer the manual labor of re-running go mod commands to re-overwrite with the desired binaries when context switching between Go projects, than having to use a go run prefix.

Comment From: mcandre

Currently working around the lack of pinning by writing the equivalent go install commands in a portable makefile. I'm prototyping a (Rust lol) validator for makefiles now, called unmake.

By the way, any Go tools not tracked in go mod are likely not scanned for potential CVEs.

The sooner that go mod implements some form of basic CLI tool pinning, the sooner that we get Snyk SCA alerts, dependabot alerts, and so on, for security concerns in our Go buildtime dependencies.

As a workaround, and complement to go mod, go install could be more proactive about detecting CVE's, exiting non-zero, refusing to build vulnerable tools, and generally behaving closer to NPM vis a vis security reports.

Naturally, go can't yank vulnerable package versions. Being decentralized and all. But we could have go refuse to process them.

Comment From: perj

I would prefer instead that go mod place any binaries into a fixed directory like ./.go-mod/bin or similar predictable, per-project directory structure.

I think this sounds a lot more complicated to use. At least for some larger projects where I'm likely to cd into a subdirectory, it would require setting up direnv or similar to keep track of the path to this directory.

Go run would work from any directory inside the module, presumably. There's also precedent in that unit tests and generate are run from the package directory, not the module one.

Your suggestion could be useful too, but perhaps as a second step?

Comment From: ConradIrwin

One thing that we might want to bring back from the original proposal is go install tools, so that you could do GOBIN=./.go-mod/bin go install tools.

We could also change the behaviour of go install stringer (if you had run stringer => golang.org/x/tools/cmd/stringer), which might help for the protobuf case... but I'm not sure it's really something to encourage. (And it's a bit unclear whether you want to bind your version of the protoc plugin to protoc which is globally installed, or to your codebase... Somewhere between the two?)

Comment From: rsc

Talked to @bcmills and @matloob. We agree that this should be a tool line and not a run shorthand, so closer to Jay's proposal than Conrad's.

What we're confident about is:

  1. Add a tool mod/ule/path line that defines a tool, with a meaning exactly like import _ "mod/ule/path" in a tools.go file.
  2. Add the -tool flag to go get to add/remove a tool line along with a require line for the tool's module.

We are less confident about whether there should be a shorthand way to run a tool. Perhaps, but perhaps not. If we were going to do that, I think we would use go tool <name>, where name is the last element of a known tool and does not conflict with pre-installed tools. For example if your go.mod said

tool golang.org/x/tools/cmd/stringer
require golang.org/x/tools v0.9.0

then perhaps go tool stringer would be equivalent to go run golang.org/x/tools/cmd/stringer@v0.9.0. Or perhaps that would be best left for a future proposal.

Or perhaps the answer is go install tools instead of that go tool stringer, but install writes to a global directory shared by all modules, and there is significant possibility of conflict. In contrast, go tool stringer could make sure to use the correct version even as you switch between modules, running it directly from the build cache.
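
Under that direction, the day-to-day workflow might look roughly like this (a sketch only; the exact flags and behaviour are still to be decided):

# add a tool line plus the corresponding require line to go.mod
go get -tool golang.org/x/tools/cmd/stringer@v0.9.0

# run the version pinned by this module, regardless of what is installed globally
go tool stringer -type=Foo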

Comment From: seankhliao

Note that we previously declined running external commands under cmd/go in #48411

Comment From: ConradIrwin

Great! Thank you @rsc, @bcmills and @matloob for taking the time to talk this through.

I’m happy to take on implementing that over the next few weeks.

What do you think about having go run cache binaries for tools in the current module?

This would make a shell script that did go run golang.org/x/tools/cmd/stringer faster for subsequent runs (and while not so important in this case, matters more if the tool links a larger code base).

I think it would be best to include it in this round, but we could defer it to a subsequent discussion if we want to explore exactly what should be cached and for how long in more detail.

Comment From: ConradIrwin

@seankhliao I think the concerns from #48411 are more specific to the idea of go X than go tool X or go run X.

The major advantage that this mechanism has over that is that we’re not depending on the outside environment ($PATH), but the lookup is defined clearly in your go files.

I do think it would be good to make go tool X work as part of this change. Although it’s slightly less easy to type than go run X it still gives you the benefit of not having to write wrapper scripts for everything. And I like that it keeps a consistent name for a consistent functionality (versioning a tool you use to develop a go program).

The main question is handling conflicts - it is definitely safer to make sure go’s tools take precedence; and I like the simplification of not having aliases in the file. That slightly restricts the names of tools, but it’s probably ok (given that $PATH has the same uniqueness restriction). In writing this up, I’ve convinced myself that the approach to conflict resolution outlined above is correct.

Comment From: mvdan

What do you think about having go run cache binaries for tools in the current module?

For reference, that's very close to https://github.com/golang/go/issues/33468. I personally use lines like //go:generate go run pkg/to/main args..., but sometimes I admit it's just too slow. For example, with https://github.com/kyleconroy/sqlc, re-building the binary (which I assume is little more than a link step) takes about one second (time go run github.com/kyleconroy/sqlc/cmd/sqlc@v1.17.2 -h). One second isn't too bad, but it adds up towards go generate ./... chugging CPU for a while and slowing down my development.

Comment From: ConradIrwin

@mvdan strongly agree! I like the idea of having go run always cache, but I noted @rsc's response on https://github.com/golang/go/issues/25416#issuecomment-401882645

To be clear, while @ianlancetaylor explained the state of the world without expressing a preference on policy, I will express a preference on policy: we don't want to start caching binaries just so that people can "go run path/to/binary" instead of installing binaries. Installing binaries is good!

I think the new observation (which may be new in the half-decade since that note!) is that installing binaries is not actually that good if you're working on multiple projects that require specific versions of tools (which is a state I find myself in when balancing work and open-source and personal side quests).

I think a reasonable first step in the right direction is to add caching for tools that are listed in go.mod (whether or not they're available as go tool X). A key insight that I think makes this both higher value and lower cost than general go run caching is that we can create a specific cache directory per module that contains each tool by name.

Once we have opt-in metrics though it would be really interesting to measure how much impact caching all instances of go run would have – I do know that at work we updated our tooling to use go build and then run the binary instead of using go run to get caching behaviour back again.
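For reference, that workaround looks roughly like this (paths illustrative, and it assumes the tool is already a dependency of the module, e.g. via tools.go):

$ go build -o ./bin/sqlc github.com/kyleconroy/sqlc/cmd/sqlc   # link once; go build reuses cached packages
$ ./bin/sqlc generate                                          # subsequent runs reuse the prebuilt binary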

Comment From: mcandre

As a workaround, I am pinning my tools using go get commands in a makefile.

make is reasonably portable, even on vintage Windows development environments. make is already commonly installed as part of a cgo development environment, and it is lightweight.

And then I lint the makefile with unmake to safeguard it.

https://github.com/mcandre/karp/blob/master/makefile

I would love to see Go tools pinned in a standard text file, similar to Python requirements-dev.txt. In fact, my projects tend to depend on Go tools + Python tools + Node.js tools, so either way, I'll still be running some go get command from a makefile or other provisioning script to tie cross-language dev dependencies together.

Comment From: rsc

Enabling caching of binaries run this way using 'go tool' is fine. We still don't want to cache arbitrary 'go run'.

As far as conflicts with go tool shortname, go tool compile should always mean the Go compiler. If there is an example.net/compile tool in the go.mod file, go tool compile still means the Go compiler. For non-builtin tools, if there is more than one tool with the same final path element, say example1.net/stringer and example2.net/stringer, then go tool stringer is an error.
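For illustration, given a hypothetical go.mod containing

tool example1.net/stringer
tool example2.net/stringer

the lookup would behave like this (error text is only a sketch):

$ go tool compile     # still the Go compiler, regardless of tool lines
$ go tool stringer    # error: ambiguous between example1.net/stringer and example2.net/stringer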

Comment From: fzipp

As far as conflicts with go tool shortname, go tool compile should always mean the Go compiler.

Will this become a problem if someday a new builtin tool is added to the Go distribution which happens to be named "stringer"?

Comment From: ConradIrwin

@fzipp It is a problem in theory, but I think it's unlikely in practice that the Go team would choose to add a tool to the distribution that conflicts with a well known tool in the ecosystem.

The same theoretical problem has always existed with $PATH, and in practice we all do a reasonably good job of keeping binary names unique.

Comment From: ConradIrwin

I wrote up a more concrete proposal of the work that I think is needed here:

https://go.googlesource.com/proposal/+/6357993223edf8118789c2a68c85de959b0fb5ef/design/48429-go-tool-modules.md

Please leave comments here (or in Gerrit): https://go-review.googlesource.com/c/proposal/+/495555/1.

Comment From: willfaught

Why not require the full path in the go tool command? Then it's unambiguous. It's no more verbose than the go run command:

$ go run github.com/kyleconroy/sqlc/cmd/sqlc@v1.17.2 -h
$ go tool github.com/kyleconroy/sqlc/cmd/sqlc -h

Comment From: ConradIrwin

It seems reasonable to support the full path in addition to the shortname.

That said, one of the key things I'd like to improve with this change is making it easy to version tools and run them. Typing out the full path doesn't make it easy.

Comment From: rsc

Saying the full path should be allowed for disambiguation, but it almost certainly won't be the common usage.

Comment From: mvdan

I'd like to flag a minor point: right now, I personally tend to do //go:generate go run foo.com/some/module/cmd/tool@v1.2.3, as opposed to the old build-ignored tools.go approach, which was similar to this proposed design in that it would add foo.com/some/module to my module graph.

Like @ConradIrwin mentions in https://github.com/golang/go/issues/48429#issuecomment-1481565052, adding tool modules to the module graph with MVS can be good. For example, when using https://github.com/protocolbuffers/protobuf-go as both the Go library and the Go code generator, it's a good idea to use the same version with both.

However, in other cases I only want to use a tool independently, such as https://github.com/kyleconroy/sqlc, which is a code generator only. Its go.mod has quite a bit of stuff, and now it would be in my module graph as well. Not a huge concern in general, but it does add noise, and it might bump some of my direct or indirect dependency versions due to MVS - which is strictly speaking not necessary.

Comment From: rogpeppe

One concern I have: since dependencies can be tricky and it's nice to be using a consistent tool across projects, I wonder if the tool line should be able to specify the exact version of the tool to use, instead of taking the dependencies from the main module.

Something like:

tool golang.org/x/tools/cmd/stringer@v1.2.3

When the version is specified, you get exactly that version of the command, evaluated in its own main module. Otherwise you get the version implied by the current main module.

Edit: this is pretty much exactly what @mvdan said above :)

Comment From: rogpeppe

When the version is specified, you get exactly that version of the command, evaluated in its own main module.

This would follow the semantics of whatever go install $pkg@$version does. Specifically, if we changed go install to allow and respect replace directives, then tool $pkg@$version would do that too.

Comment From: rogpeppe

One other thought: it's possible that there might be name conflicts between tools (for example we might want to use two different tools named generate from different projects). We could consider allowing a two-argument form of tool which specifies the command name of the tool.

For example:

tool foo_generate foo.com/cmd/generate@v0.1.2
tool bar_generate bar.org/cmd/generate

This would also mean that it would be possible to specify more meaningful, locally relevant, or stable names than those provided by the author of the tool.

Comment From: ConradIrwin

@mvdan @rogpeppe re: being able to run a tool as its own main module.

We discussed this a bit above, and it's definitely something that people do today, and to be clear, it's not something that will change when this proposal is accepted (you will still be able to do //go:generate go run foo.com/some/module/cmd/tool@v1.2.3 and ignore the tool support).

As you mention there are some tools where it's important that the versions match; and there are other tools where it doesn't matter so much.

The primary downside to adding support for both is that it results in a subtle distinction: how should anyone decide whether they want tool x/y@z or tool x/y? (And how would we add support for go get -tool x/y@z that allows for both; would it make sense to allow go get tools still?).

There are few tools (maybe none) where it is important that they are not added to your dependency graph, so I think the right approach is to only support versioning tools and their dependencies explicitly for now. (The other thing this change makes better is you can be sure that anyone depending on your module will not inherit your tools' dependencies; so any increase in go.mod size is local).

As an aside, the other worry I have about tool x/y@z is how well build reproducibility works – the tool would presumably not be in go.sum, nor would any of the dependencies. While we have the module proxy it "should be fine", but it seems a lot less solid.

Comment From: ConradIrwin

@rogpeppe re: aliases.

I like the idea of having an optional alias. It may not be a problem in practice (e.g. $PATH is a global namespace), but it seems like a potentially better solution than just disallowing conflicts.

The main downside is parsing complexity in go.mod, but it doesn't seem that complicated.

Comment From: rogpeppe

@ConradIrwin

We discussed this a bit above, and it's definitely something that people do today, and to be clear, it's not something that will change when this proposal is accepted (you will still be able to do //go:generate go run foo.com/some/module/cmd/tool@v1.2.3 and ignore the tool support).

If you do that, then you won't (presumably) be able to take advantage of the support for go tool $tool sugar, which would be a bit frustrating: you follow "good practice" for a reproducible tool dependency and you pay a price for it because you now need to explicitly mention the entire path and version in every go:generate directive.

The primary downside to adding support for both is that it results in a subtle distinction: how should anyone decide whether they want tool x/y@z or tool x/y? (And how would we add support for go get -tool x/y@z that allows for both; would it make sense to allow go get tools still?).

I'd suggest that the @z form should be the default, and some other variant form (e.g. -main) would be used to explicitly bring the tool into your actual main module dependencies.

There are few tools (maybe none) where it is important that they are not added to your dependency graph

I tend to disagree. I think that keeping dependency graphs small is important, and any help we can provide in that direction is good. Also, as I said earlier, I think it's nice to be using a consistent build for a tool so different projects aren't all using slightly different variants by virtue of being included in different dependency graphs.

As an aside, the other worry I have about tool x/y@z is how well build reproducibility works – the tool would presumably not be in go.sum

Is there any particular reason why it and its dependencies couldn't be in go.sum? I can't think of one for now.

Comment From: ConradIrwin

@rogpeppe re " I think that keeping dependency graphs small is important". I'm curious why this is important to you?

I think it's nice to be using a consistent build for a tool so different projects aren't all using slightly different variants by virtue of being included in different dependency graphs.

Yes, I agree with you here; it would be ideal for the tool author to have control of the versions of dependencies used (and as you say, in order to make this really possible, we'd want to use the replace directives from the tool's go.mod – c.f. the reasoning given here: https://golangci-lint.run/usage/install/#install-from-source). There's always going to be some tension though, as the tool user will always have the ability to override if they need to.

Supporting this extra case does complicate things though, and I'm not sure the complication is worth it to pander to tool authors (who, as in the case of golangci-lint, already recommend using a different mechanism).

The challenges I think are:

  • Anyone analyzing the dependency graph will need to pull in tools using a separate mechanism (e.g. vulnerability scanners will have to learn how to transitively list a tool's dependencies in this case, as they won't be listed in go.mod).
  • We lose the ability to make things like a tools metapackage work easily, because such a package may need to depend on multiple versions of the same modules (which is not currently supported). This may not be a huge loss, and we would definitely be able to make things like go install tools or go test tools work regardless of how it's implemented under the hood.
  • User confusion (as above) as to what a given tool line in the go.mod actually does.

It may be that we want to take on the extra cost for the benefit of tool authors, and although we probably would need a separate discussion around fixing tools with replace directives, it does seem sensible to fix both around the same time.

Is there any particular reason why it and its dependencies couldn't be in go.sum?

They could be I suppose. We'd just need to teach the tools that maintain go.sum about a new set of dependencies that are not listed in go.mod the same way. (It goes back to the first of the three challenges above).

It seems like what it really comes down to is: is the added complexity in implementation and user interface worth the added benefits? I can definitely see arguments in either direction, but I'm curious whether there are other thoughts.

Comment From: rsc

Got sidetracked by the release, back to this conversation now.

@ConradIrwin The doc you sent looks great. The only part I don't understand is the mention of a special error from 'go run'. I thought we had decided that 'go run' has nothing to do with tool lines.

@rogpeppe, I believe you've made two suggestions:

  1. Allow / encourage tool lines to specify the 'shortname' form for use with 'go tool shortname'.
  2. Allow / encourage tool lines to specify pkg@version, keeping tools out of the module graph.

For (1), what is the case you are worried about? Today people 'go install' tools and somehow we get by with a single directory. It's difficult for me to believe that two different generator tools are going to pick the same name accidentally. (And we certainly shouldn't be encouraging them to do it on purpose.) In the rare, accidental case, falling back to 'go tool full/path' seems fine.

For (2), one of the simplifying assumptions of the go command, both in its internals and its behavior, is that there's only one build graph. So if you run commands like 'go mod graph' or 'govulncheck' or 'go list' or anything else, it will tell you about the one build graph. Allowing 'tool pkg@version' introduces a separate build graph for use with that specific command. The go command is not going to start dealing with multiple build graphs in a single command - that would be a very large amount of implementation complexity, far outweighing the entire benefit of having tool lines at all. So go tool pkg would have to just behave like 'go run pkg@version' and otherwise the tool lines would be ignored by other commands. Some of the concrete drawbacks of doing that include:

  • You can't see the dependencies of your tools in things like 'go mod graph'.
  • 'go mod tidy' can't see the tools, so it won't write go.sum lines, so 'go tool pkg@version' has to consult the checksum database on each invocation.
  • The tool dependencies are not considered by 'go get' when updating during 'go get -u'.
  • The tool dependencies are not considered by 'go get' when maintaining the go and toolchain lines, so there is nothing to stop you from having 'go tool pkg@version' need to switch to a new toolchain every time it runs too. 'go tool' is not going to be a go.mod-writing command, so every single invocation may need to do the switch. This is problematic.

I'm sure there are more drawbacks I am not thinking of. I don't see what the benefits of this approach are. The only justification I see above is "dependencies can be tricky and it's nice to be using a consistent tool across projects". I would argue that dependencies being tricky is a reason to include the dependencies in the standard build graph, so that we can bring all our standard tools to bear on the trickiness. And I think using consistent packages across all parts of one project outweighs using a consistent build of a tool across different projects. If there's some log4j-esque problem I want to be able to 'go get log4j@latest' and know that my project is now completely safe, including 'go tool foo' invocations.
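As a sketch of that last point (module path and version invented), a single upgrade covers both your imports and your 'go tool foo' invocations, because there is only the one build graph:

$ go get vulnerable.example/logging@latest   # hypothetical module that just shipped a fix
$ go list -m vulnerable.example/logging      # one selected version, used by packages and tools alike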

Comment From: ConradIrwin

@rsc Thanks!

Currently the error message from go run is:

$ go run golang.org/x/tools/cmd/stringer
no required module provides package golang.org/x/tools/cmd/stringer; to add it:
    go get golang.org/x/tools/cmd/stringer

This is a little unhelpful, because when you run go mod tidy it will remove the requirement again.

The suggestion is just to improve the error message:

$ go run golang.org/x/tools/cmd/stringer
no required module provides package golang.org/x/tools/cmd/stringer; to add it:
    go get -tool golang.org/x/tools/cmd/stringer

This will add the tool line to go.mod and add it to your module in a way that will outlast any go mod tidy's.

After you have a tool line, go run golang.org/x/tools/cmd/stringer will work (because the module will be required) even though I expect go tool stringer to be more used.
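So the end-to-end flow would be something like this (a sketch, assuming the proposed go get -tool and go tool behaviour):

$ go get -tool golang.org/x/tools/cmd/stringer        # adds a tool line plus the require lines it needs
$ go tool stringer -type=Pill                         # runs the version pinned in go.mod
$ go run golang.org/x/tools/cmd/stringer -type=Pill   # also works, since the module is now required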

Comment From: rsc

@ConradIrwin, I think that error message change is not correct. If people are using "go run" then we shouldn't tell them to do something that is meant to be used with "go tool" but accidentally also makes "go run" work. Honestly I think the "to add it" note should probably be removed entirely. There are two possible ways to fix that error. The other is to add @version to the command line. Often that's what you want instead. We should stop presuming one solution.

Comment From: ConradIrwin

Makes sense. I've removed that work from the proposal, and have filed an issue to track fixing that error https://github.com/golang/go/issues/60944

Comment From: rsc

Have all concerns about this proposal been addressed?

Comment From: mvdan

I don't see what the benefits of this approach are. The only justification I see above is "dependencies can be tricky and it's nice to be using a consistent tool across projects". I would argue that dependencies being tricky is a reason to include the dependencies in the standard build graph, so that we can bring all our standard tools to bear on the trickiness.

I tried to explain my reasons in https://github.com/golang/go/issues/48429#issuecomment-1551584506. Trying to answer your question more directly, using sqlc as an example of a generator tool:

  • It's a generator whose dependencies are unrelated to my own dependencies.
  • It has quite a few of those dependencies; there's little reason for me to stick them in my go.mod.
  • Some of those dependencies might bump my go.mod versions due to MVS. Again, I don't see a reason to do that.

I will once again say that folding the tool versioning into the main build graph can be good with some tools that do need the consistency in versioning, like protoc-gen-go with the protobuf libraries. However, with some others like sqlc, there's no such thing, and that's why I currently use go run github.com/kyleconroy/sqlc/cmd/sqlc@v1.18.0. It would be mildly unfortunate if using the proposed approach strongly encouraged adding all tool dependencies to go.mod regardless of this distinction.

Comment From: ConradIrwin

  • It’s a generator whose dependencies are unrelated to my own dependencies.
  • It has quite a few of those dependencies; there’s little reason for me to stick them in my go.mod.
  • Some of those dependencies might bump my go.mod versions due to MVS. Again, I don’t see a reason to do that.

I think there are two specific concerns:

  • It might add a bunch of new dependencies to my go.mod (and I don't see a reason to do that)
  • It might bump the version of some existing dependencies (and I don't see a reason to do that)

The reason is to give you knowledge of and control over the code that you run.

There's an important question: do you need knowledge/control of your tools' dependencies? As you imply, the vast majority of the time it does not seem to matter, and so "no" seems like a reasonable answer. However there will always be some cases where it does matter (either because of a runtime dependency, or a security fire drill like @rsc's log4j example). That implies we should give people tools to observe and control their tools' dependencies.

Given that, there’s a question of: how should we enable this? Rather than inventing a new mechanism, re-using the existing mechanism is much better: it’s only one tool to learn (and only one set of config files in the repo :D).

Of the two downsides you mention, I am still not sure why "adding new dependencies" is a problem. It's relatively cheap to have a few more lines in the main module's go.mod; and anyone who depends on your module will not inherit your tool's dependencies.

I am more sympathetic to the "changing versions of shared dependencies" concern – that can sometimes make a difference (in the case that the tool or your app depends on a library that contains a change that breaks the tool or your app). In that case updating the library as part of installing a tool is going to surface a problem. It's worth noting that the breaking change and the incompatibility is a problem that already existed: if the tool and the app both want to keep their dependencies up to date (which they probably do), the incompatibility will need fixing at some point – all this does is make you aware of it now.

You don’t have to fix any problems that arise immediately. You can continue using go install or go run instead, or use a replace to incorporate a local fix or an exclude or require to select a better version.

As an experiment, I added sqlc@1.18.0 to my company's backend repository. We previously had 201 dependencies listed in go.mod; installing this added 13 new ones, updated 1 that our app used, and 4 that other tools in our tools.go used. It didn't cause any problems.

It's better to give people knowledge and control over the code they run, and the extra data tracked in go.mod and go.sum is a small price to pay.

Comment From: thepudds

FWIW, I am simultaneously:

  • (a) thrilled to see tools given more support, such as outlined in this proposal, and
  • (b) very concerned that implementing this proposal (as it stands and without other changes) might result in local convenience that leads to a net increase of global inconvenience.

Part of Go's mission is for software that scales. I'm concerned this proposal (alone and as is) would likely end up unnecessarily exacerbating some challenges faced especially by medium to large Go projects. One sample writeup is in https://github.com/golang/go/issues/52296#issuecomment-1097143694 and elsewhere in that issue (which in summary describes how large projects in particular can frequently have some indirect dependencies deep in the dependency graph that force the whole dependency chain to update to a problematic version of something, which often can be traced to the blending of dependencies that occurred via a tools.go file, which then causes unnecessary challenges, including for consumers of the large projects. Module pruning helps but does not eliminate such problems).

If this proposal is currently targeting the "blend in my tool dependencies" use case, ideally it would happen at the same time as something that makes it easier to manage the use case of "keep my tool dependencies separate".

Otherwise, I worry the thumb might be pressed too hard on the scale for "blend in my tool dependencies".

(If we slice the cases slightly differently, I suspect the "I need to blend in my tool dependencies" case is less common than the "it's fine for my tool dependencies to be separate" case. Whether an individual project blends or separates their tool dependencies is probably less heavily influenced by whether they need to blend in their dependencies, and is instead likely more heavily influenced by what's easier and what's considered a "blessed" approach. Regardless of which case is more common, I think it's likely fair to say neither case is rare, and we probably don't want to overly steer people away from either case).

Prior to approving this proposal, I think it probably would be wise to explore in greater depth if the "keep my tool dependencies separate" use case could be improved contemporaneously with this proposal (either via other proposals, or by perhaps extending this proposal).

Ideally, the two cases would be on equal footing in the tooling, or at least in the same rough ballpark of convenience. (Ideally they would be ~tied in terms of convenience, but if you look at things like (1) initial setup vs. (2) day-to-day use vs. (3) maintenance operations such as upgrades, it's probably reasonable if, for example, one use case is a bit less convenient to set up, provided the day-to-day convenience is still close to a tie.)

Comment From: thepudds

I don't really have a concrete suggestion, but:

Half-baked idea 1

One older thought I had was wondering if there could be an alternative "blessed" layout for tools.go that keeps dependencies separate with some small-ish tooling tweaks to make it more convenient.

For example, if someone sets things up like so:

.
β”œβ”€β”€ go.mod             // my primary module
β”œβ”€β”€ mycode.go
└── tools
    β”œβ”€β”€ mage
    β”‚   β”œβ”€β”€ go.mod     // nested module for mage (dependencies separated)
    β”‚   β”œβ”€β”€ go.sum
    β”‚   └── tool.go
    └── stringer
        β”œβ”€β”€ go.mod     // nested module for stringer (dependencies separated)
        β”œβ”€β”€ go.sum
        └── tool.go

then the question would be what tooling changes would make that convenient?

For example, one thought I had after seeing Tim Hockin propose the -C flag was that we could try to lean into that for this use case, with something like:

Set up (one time per tool)

$ mkdir -p ./tools/mage
$ cd ./tools/mage
$ go mod init tools            
# use editor to manually create tools.go file
$ go get github.com/magefile/mage@latest

Install (from top level of my module)

$ go install -C ./tools/... <something>         # all tools to GOBIN
$ go build -C ./tools/... -o ./bin/ <something>  # all tools to some dir

Update (from top level of my module)

$ go get -C ./tools/mage github.com/magefile/mage@latest   # update one
$ go get -C ./tools/... <something>                        # update all??

...but that's not ideal, and also not fully specified.


Half-baked idea 2

A separate thought would be to adjust this current proposal to allow an optional tools directory in the top level of the module (parallel to a vendor directory, if any), where a go tool command would know to look inside that directory for a nested module.

The layout could be similar to idea 1, but now with no tools.go:

.
β”œβ”€β”€ go.mod             // my primary module
β”œβ”€β”€ mycode.go
└── tools
    β”œβ”€β”€ mage
    β”‚   β”œβ”€β”€ go.mod     // nested module for mage (dependencies separated)
    β”‚   └── go.sum
    └── stringer
        β”œβ”€β”€ go.mod     // nested module for stringer (dependencies separated)
        └── go.sum

Set up (one time per tool)

$ mkdir -p ./tools/mage
$ cd ./tools/mage
$ go mod init tools      # not sure module name matters? maybe not 'tools'?
$ go get -tool github.com/magefile/mage@latest

Run (from anywhere within my primary module)

$ go tool mage            # knows to consult 'tools' dir

Install (from anywhere within my primary module)

$ go install tools         # output to GOBIN
$ go build -o ./bin/ tools  # output to some dir

Update (from anywhere within my primary module)

$ go get tools                                  # update all
$ go get -tool github.com/magefile/mage@latest  # update one; would also work inside ./tools/mage

I don't know if that works... but maybe someone will have a more concrete idea.

Comment From: zephyrtronium

If this proposal is currently targeting the "blend in my tool dependencies" use case, ideally it would happen at the same time as something that makes it easier to manage the use case of "keep my tool dependencies separate".

It seems like go run github.com/kyleconroy/sqlc/cmd/sqlc@v1.18.0 already implements the "keep my tool dependencies separate" case. It is the "blend in my tool dependencies" case that currently ranges from awkward to hard. Maybe I'm naΓ―ve, but wouldn't both go in a go:generate line or a script anyway, to keep arguments controlled in addition to versions? If so, I don't really see the proposal as being dramatically more or less convenient than versioned go run.

Comment From: jimmyfrasche

Ideally you should be able to use and manage tools the same way whether their dependencies are intermingled or separate. The only time you should have to think about the difference is when you add the tool.

Comment From: bcmills

I've been thinking about how to reduce the impact of the tool dependency on downstream consumers of your module, particularly in the context of graph pruning.

The key invariant for graph pruning is this: your go.mod file needs to explicitly record versions for everything imported by your packages and tests. That way consumers of your module can test the packages they import from your module without chasing down your whole module graph.

The all pattern includes all packages transitively imported by the packages and tests in the main module, and we use that to enforce the above invariant. If a package is in all, its module must be listed explicitly in the go.mod file, and otherwise (that is, if it is a test-only dependency of an outside package) it is only listed if its selected version is not already implied by the other dependencies.

Your tool dependencies are necessarily not imported by any package or test in your module. Therefore, nothing will break for your users if we leave them out, the same as we do for external test-only dependencies.

So, we have a few choices we could make here:

  1. Should each module that provides the package main for a tool be listed explicitly in the go.mod file? (I think it should, mainly to avoid confusion about which version of the tool is to be used.)
  2. Should the modules that provide packages transitively imported by tools be listed explicitly in the go.mod file?
  3. Should tools be included in the all pattern, for commands like go test all?
  4. Should the packages transitively imported by tools be included in all?


  • If we record tool versions explicitly but don't include tools in all, then go list -deps -test all no longer includes all of the packages that go mod tidy cares about.
  • If we include tools in all but don't include their transitive imports, then the set all is no longer closed over "imports".
  • If we include tools and their transitive imports in all but don't record module versions for transitive imports, then go test all is no longer reproducible. (We don't have enough information in the pruned module graph for the test dependencies of those transitive imports.)
  • If we record module versions for the transitive imports of tools, then tool dependencies have a somewhat larger impact on downstream MVS results.

I feel strongly that we should not break the reproducibility of go test all, but I don't have a good intuition for the other tradeoffs.

Comment From: ConradIrwin

@bcmills I'm sure you've got more context on this than me, but I believe the answer is "yes" for 1 and 2 (that way we get all the benefits of managing tool dependencies that we want, and visibility into the versions being used, etc.).

If you use the tools.go workaround correctly (i.e. it's in its own package in your module that no-one depends on, with build tags that exclude it from ever being built), then 3 & 4 will also currently be true. I think that's a strong argument for making the answer "yes" to those questions too.

It would be possible to list tools and their dependencies in go.mod but not have them be in the "all" metapackage; but I think that will likely feel inconsistent. For example, if you do go get -u all it may update dependencies shared by tools; so go test all should test your tools and their dependencies too.

Comment From: ConradIrwin

@thepudds Thanks for the link to that comment, it's good to hear problems first-hand.

I think having first-class support for versioning tools and their dependencies might be worth living with for a while before we try and extend it further. The goal of this proposal was to make an easier to use alternative to the tools.go workaround.

The right solution to keeping dependencies separate might be more general: in a project with multiple build targets, how do you delay updating the dependencies of one while updating the dependencies of another? That doesn't seem like a tool-specific problem (though it may feel more annoying with tools because most of the time you don't really want to care about their dependencies).

I like the direction of your ideas of re-using the go.mod solution for versioning, but having multiple of them – it avoids complicating things for the common case, but allows people an escape hatch if they need it. I am a bit skeptical that we should try and define the exact structure and bake that into go tool yet.

If you set up the project structure as you defined above (with a go.mod/go.sum per tool), you could implement something like this.

# go.mod
tool example.com/run
// example.com/run/main.go
package main

import (
	"os"
	"os/exec"
)

func main() {
	tool := os.Args[1]
	// Run `go tool <tool>` from tools/<tool>/ so the nested go.mod there is used.
	cmd := exec.Command("go", append([]string{"tool", tool}, os.Args[2:]...)...)
	cmd.Dir = "tools/" + tool
	cmd.Stdin, cmd.Stdout, cmd.Stderr = os.Stdin, os.Stdout, os.Stderr
	if cmd.Run() != nil {
		os.Exit(1)
	}
}

And then you could:

$ go tool run stringer

It's still not perfect, but it's a lot better than what we have today. If lots of people start using something like example.com/run then it could make sense to bring that functionality back into go tool itself.

Comment From: gopherbot

Change https://go.dev/cl/508355 mentions this issue: modfile: Add support for tool lines

Comment From: leighmcculloch

  • (b) very concerned that implementing this proposal (as it stands and without other changes) might result in local convenience that leads to a net increase of global inconvenience.

I am also concerned about this. Specifically that tools in go.mod will be so convenient that it will become the default way to specify tools, at the expense of tool dependencies having side-effects on library dependencies.

It seems like go run github.com/kyleconroy/sqlc/cmd/sqlc@v1.18.0 already implements the "keep my tool dependencies separate" case.

I think deps separated is a better default for libraries. We have go run path@version that is a pretty good solution for when we want deps separated, but if tools in go.mod is more convenient, deps will probably be more often incorporated rather than separated, at the expense of importers.

Is the plan for this proposal when implemented for tools in go.mod to mix the deps of tools with libraries?

Comment From: zikaeroh

I work with enough code generators that are also libraries that I'm strongly in favor of the dependencies "mixing" i.e. there only being one module graph. For example, sqlboiler and gqlgen are tools I could mark as tool deps, but the code they generate also imports their libraries. If a transitive dep ends up mismatching between the two trees, it's possible that my generated code will be wrong (silently or not), whereas a single module graph can never have that problem, which is the current state of things with tools.go.

Comment From: zephyrtronium

@leighmcculloch

..., but if tools in go.mod is more convenient, ...

This is my point. It doesn't seem like it would be more convenient for anything but the specific case where you want its dependencies tracked in go.mod. Say there is some arbitrary tool at gitlab.com/bocchi/bocchi. The cases I think would be common are:

  • Running it with dependencies tracked in go.mod. I can't imagine a scenario where I need the tool's dependencies tracked in go.mod but where the tool and my code will not interact, so it follows that I would always add a go:generate line for this. Under this proposal, I add //go:generate go tool bocchi -flag=flag once and am done.
  • Running it with a pinned version but with dependencies separated. Once again, if I need a pinned version for a given project, why would I not add a go:generate line? It would have to be a tool which breaks some interaction at some later version but which I invoke with arbitrary command lines; I can't picture what that would be. So, I add //go:generate go run gitlab.com/bocchi/bocchi@v1.1.0 -flag=flag once and am done.
  • Running it without a pinned version. I don't see why it would make sense to track the versions of dependencies of the tool but not the tool itself, so it doesn't affect or appear anywhere in go.mod. The most convenient way to use this is and remains go install gitlab.com/bocchi/bocchi@latest.

That said, I tend not to use many external tools in my development, so I wouldn't be especially surprised if there are indeed tools that don't fit those assumptions. And the fact that the go tool method is the only one which has a mechanism in the go tool to automatically check for updates might be enough to qualify as "more convenient," even if that could be addressed otherwise.

Comment From: leighmcculloch

It doesn't seem like it would be more convenient for anything but the specific case where you want its dependencies tracked in go.mod.

I expect go tool bocchi will appear more convenient than go run gitlab.com/bocchi/bocchi@v1.1.0 and so will be preferred in all cases on the basis of apparent simplicity, but the only evidence I have for this is anecdotal. I'm just concerned the apparent simplicity will lead to it being preferred when it shouldn't be. (The anecdotal evidence being that folks I worked with did not like the verbosity of go run in all our go generate comments, and switched to the tools.go version without realizing the impact on the dependency graph, leading to the change being discarded.)

Comment From: myitcv

I've been thinking about how to reduce the impact of the tool dependency on downstream consumers of your module, particularly in the context of graph pruning.

I've not been keeping fully abreast of this thread over time, so please disregard my comment if it misses the mark.

Whilst it seems necessary and reasonable to me that a dependency's own test dependencies could/should influence the set of the dependencies of the main module (not least because I want to be able to run tests on a dependency), the tools used by the dependency author do seem like an entirely orthogonal concern as a consumer of that module.

Taking an extreme example, https://github.com/tetratelabs/wazero is a zero-dependency dependency. That is, its go.mod is empty with respect to dependencies. If that project were to start listing tools in the go.mod, we would be in the somewhat strange position that wazero starts to impact my main module dependency graph. Unless I have missed something?

That feels like a leaking of an abstraction that can and should be limited to contributors to the dependency, not consumers.

Comment From: ConradIrwin

@myitcv That's correct: people who depend on wazero will not inherit any dependencies from tools wazero uses.

Comment From: myitcv

@myitcv That's correct: people who depend on wazero will not inherit any dependencies from tools wazero uses.

I'm unclear from @bcmills' answer that tool dependencies of wazero will definitely not affect my main module's build graph. Because it sounds like they will.

(I don't know what you mean by "inherit" incidentally)

Comment From: leighmcculloch

In this proposal will a library that has wazero v1.2.0 as a tool require all its importers to use at least wazero v1.2.0 if they are also using it as a tool or a library?

Tools occur to me much like replace directives. They only need to affect the developer working on that specific project. E.g. if there is a module A using tool wazero v1.2.0, and a module B that imports module A and uses tool wazero v1.1.0, it should be okay for module B to use wazero v1.1.0 and not v1.2.0.

Is that how it will work in the proposal? Or will tool versions propagate to importers?

Comment From: willfaught

Perhaps the go run command should invoke tools instead of a new go tool command. We already use it for go run example.com/foo@latest, so why not also enable go run example.com/foo, where example.com/foo@vX.Y.Z is in go.mod, and go run foo, where example.com/foo@vX.Y.Z is in go.mod and there isn't a name ambiguity.

Comment From: ConradIrwin

@leighmcculloch I see the confusion now, thanks for clarifying.

The proposed solution is to have the same impact on dependency versions as the current tools.go, the advantages are that it is easier and faster to use. To clarify:

If you require a module that uses a tool, your go.mod will not gain new require lines for the modules needed by that tool or its dependencies. So even if wazero used tool lines, requiring wazero will not add any require lines related to its tools to your go.mod.

If you require a module containing a tool, and you have a shared dependency with that tool, and the tool requires a later version than your main module, then requiring the module will also cause minimum version selection to select the later version of your dependency (the one specified by the tool). So if wazero used tool lines, requiring wazero might bump the version of one of your existing dependencies. (This is already true today: adding a dependency may bump versions, and is exactly how tools.go works today).
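To make that second case concrete, a hypothetical example (module paths and versions invented):

your go.mod:              require golang.org/x/text v0.9.0
wazero's go.mod:          require golang.org/x/text v0.11.0   // needed only by one of its tools
selected for your build:  golang.org/x/text v0.11.0           // MVS picks the highest required minimum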

Fixing it would require something like https://github.com/golang/go/issues/52296, which was declined (to quote the reasoning):

But we would need to trade that benefit against the costs, and the costs as I see them are numerous:

  • go mod why and go mod graph would become more complex to use and interpret. (They would have to report not only the module graph, but the reasons for the module graph containing – and excluding – the nodes that it does.)
  • The implementations of the module loader and package loader would become much more tightly coupled. (They're already quite complex, but at least today we can have the package loader call into the module graph and not the other way 'round.)
  • Several operations that are fairly inexpensive today (such as running go test on a package imported by the main module) would become approximately as expensive as running go mod tidy.

I do agree that it would be a bit "nicer" if this was not the case. And agree that it could be that if we had tool lines then we should reconsider #52296, because it might encourage people who currently don't use tools.go to use tool lines, so the scenario might become more common. That said, until we have support for it, and see whether people use it, and if they do whether it causes a significant increase in the type of problem described, it seems hard to know if it swings the cost-benefit analysis from @bcmills.

Comment From: ConradIrwin

Perhaps the go run command should invoke tools instead of a new go tool command. We already use it for go run example.com/foo@latest, so why not also enable go run example.com/foo, where example.com/foo@vX.Y.Z is in go.mod, and go run foo, where example.com/foo@vX.Y.Z is in go.mod and there isn't a name ambiguity.

@willfaught go run example.com/foo will already work if you have a require example.com/foo in your go.mod. I would ideally like to support go X foo (see rationale in the proposal); to provide less typing and better caching.

Whether X is run or tool is more arguable; but I liked @rsc's suggestion that this is a "tool", not just a shorthand for run. And tool as a word works better for a declaration line in go.mod and an argument to go get -tool.

Comment From: willfaught

We could simplify the problem by focusing only on tools that create or modify code or data. If such a tool is used to generate code in one part of a module, and other parts of the module need to interact with that code, then the tool module and its full deps should be reflected in the module graph. Even if no other such code exists in the module, the generated code may be intended to interact with code generated by the same tool in another module, in which case the tool module and its full deps should be reflected in the module graph. So the tool modules and their full deps should always be reflected in the module graph.

The responses to @bcmills's questions and points would then be:

Should each module that provides the package main for a tool be listed explicitly in the go.mod file?

Yes

Should the modules that provide packages transitively imported by tools be listed explicitly in the go.mod file?

Yes

Should tools be included in the all pattern, for commands like go test all?

Yes

Should the packages transitively imported by tools be included in all?

Yes

If we record tool versions explicitly but don't include tools in all, then go list -deps -test all no longer includes all of the packages that go mod tidy cares about.

N/A

If we include tools in all but don't include their transitive imports, then the set all is no longer closed over β€œimports”.

N/A

If we include tools and their transitive imports in all but don't record module versions for transitive imports, then go test all is no longer reproducible. (We don't have enough information in the pruned module graph for the test dependencies of those transitive imports.)

N/A

If we record module versions for the transitive imports of tools, then tool dependencies have a somewhat larger impact on downstream MVS results.

Just like any other dependency. Be careful what you add to your deps.

If people need to use a tool that doesn't create or modify code or data, then it has no impact on Go builds, so it doesn't need to be tracked by Go modules.

So in this sense, tools are just main packages contained within the module graph. We wouldn't need special go.mod directives or commands. They could work with go run example.com/foo@vX.Y.Z (vX.Y.Z being in the module), or go run example.com/foo (using vX.Y.Z from the module), or go run foo (where there's no name ambiguity, using vX.Y.Z from the module).

Comment From: rsc

The definition of "all" influences many things. After talking to @bcmills and @matloob, I believe (and I believe they agree) that modules providing tools and modules providing tool dependencies need to be included in "all". The reason is that commands like "go mod graph", "go mod verify", "go mod download", and "go mod vendor" are all keyed off of "all", as is the content of go.sum.

In particular, I think we definitely want someone who has tools configured to be able to run either "go mod download" or "go mod vendor" and then turn off their network and be able to do everything in that module, including run those tools. To do that, we need download to fetch those modules and we need vendor to include them in the vendor directory. The vendor use case in particular makes clear that these tools have to be part of the main build graph, since the vendor directory can only include one version of any given dependency.

To make the contributions from tools clearer in go.mod, I think we can add a // tool-only comment on go.mod lines that are only there to provide tools. Similarly tool dependencies would say // indirect, tool-only.
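For example, a go.mod might then look something like this (module path, versions, and the exact comment text are all illustrative):

module example.com/mymodule

go 1.21

tool golang.org/x/tools/cmd/stringer

require (
    golang.org/x/tools v0.9.0 // tool-only
    golang.org/x/mod v0.12.0 // indirect, tool-only
    golang.org/x/sync v0.3.0 // indirect, tool-only
)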

To echo a point made by others already, if you depend on some module M and that module has lines in its go.mod to support a tool that M uses, then those requirements do not get lifted into your go.mod just because you import a package P from M. The requirements that get lifted into your go.mod are only the ones for the modules you need to compile the package P, and that never uses any tools. So the decision about "all" only affects the local go.mod file; it does not affect the go.mod files of any users of a module.

Comment From: thepudds

The requirements that get lifted into your go.mod are only the ones for the modules you need to compile the package P, and that never uses any tools. So the decision about "all" only affects the local go.mod file; it does not affect the go.mod files of any users of a module.

I might be misunderstanding, but it seems it would affect the go.mod files of some users of a module in some cases, including because it can affect the local go.mod.

One of the classic issues is that a tools.go file can cause shared dependencies to be upgraded such that a consumer of a module can see a higher version of the shared dependency recorded in the go.mod file of a module with the tools.go compared to the version the consumer would have otherwise seen.

One flavor would be a module M has a normal direct or indirect dependency on X that is used by M's package P (independent of a tools.go file), but M also has a tools.go file that pulls in some tool Y that also depends on X, which then pushes the version for X recorded in M's go.mod to a higher version than it otherwise would have been absent the tools.go file.

When some consumer module C then imports M's package P, the consumer module C then sees a higher version of X than consumer module C would have otherwise seen absent the tools.go in M, which might be a problem for the consumer module C.

Playing the role of X in the past have been various common modules like logrus or etcd, or observability packages, and so on.

As I said, maybe I'm misunderstanding, but would behavior like that be the case?

Comment From: thepudds

FWIW, here is a testscript file that attempts to show a concrete example of my immediately prior comment.

I put it together quickly w/ some shortcuts, so maybe one of the shortcuts is invalidating the example, or maybe I've made another mistake, or maybe the tools.go analogy is not appropriate.

To run, download https://gist.github.com/thepudds/67b93bc1b1434b2d7735ff83c5db2a34 as example.txt, then:

go install github.com/rogpeppe/go-internal/cmd/testscript@latest
testscript example.txt

In short:

  • Playing the role of the shared module X is logrus.
  • A module m uses logrus, and to start, m does not have a tools.go file.
  • There is also a consumer module consumer that depends on m and also uses logrus.
  • m then later adds a tools.go file, which in this example causes a problem for consumer.

Comment From: bcmills

@thepudds, in general we expect users to resolve compatibility problems by upgrading rather than pinning, but for the β€œpinning” case we do still have the exclude directive β€” and exclude (unlike replace) ought to play nicely with graph pruning and go install pkg@version.

So I think you're right that the tool dependencies can occasionally surface compatibility problems that require some extra work to resolve, but I also think that we already have the tools needed to resolve them.

(In your example: at the point where the tool is added, one might also add exclude github.com/sirupsen/logrus v1.9.3 to the go.mod file.)
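Concretely, that would be a line like this in consumer's go.mod:

exclude github.com/sirupsen/logrus v1.9.3   // the go command then ignores this version when resolving requirements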

Comment From: rsc

Double-checked with @bcmills, and we both think this is ready to go. @ConradIrwin are you still interested in implementing it?

Have all remaining concerns been addressed?

Comment From: myitcv

Thanks for the various responses above, @rsc.

I might have missed an answer to https://github.com/golang/go/issues/48429#issuecomment-1629646464, because your response appeared to address a different but related point.

To echo a point made by others already, if you depend on some module M and that module has lines in its go.mod to support a tool that M uses, then those requirements do not get lifted into your go.mod just because you import a package P from M. The requirements that get lifted into your go.mod are only the ones for the modules you need to compile the package P, and that never uses any tools. So the decision about "all" only affects the local go.mod file; it does not affect the go.mod files of any users of a module.

That said, I don't understand the phrase "do not get lifted" so might simply be missing your point.

Taking your example:

  • My module imports package P from module M
  • M has lines in its go.mod to support a tool T that M uses

Is my build graph influenced by T's dependencies?

I could well be missing some important detail here about this proposal with respect to dependency pruning, but I can't see how the answer to that question is anything other than "yes".

That being the case, the only concern I have is that modules which look to keep their dependency surface small for the sake of their users might well eschew this technique. Because the inclusion of a tool dependency increases the dependency surface for their users.

Thanks

Comment From: bcmills

Is my build graph influenced by T's dependencies?

Yes, but if they cause a problem for you, you can notch them out with exclude directives. (That's essentially the same as for any other transitive dependency that doesn't affect the package(s) you import from that module.)

That being the case, the only concern I have is that modules which look to keep their dependency surface small for the sake of their users might well eschew this technique.

FWIW, there is already an analogous problem with the //go:build tools technique in use today. So this proposal at least doesn't make things any worse on that front.

Comment From: myitcv

Is my build graph influenced by T's dependencies?

Yes, but if they cause a problem for you, you can notch them out with exclude directives.

Thanks for confirming. However that solution doesn't sound like it scales well in terms of advice to users of my module who were looking for a small dependency surface.

FWIW, there is already an analogous problem with the //go:build tools technique in use today. So this proposal at least doesn't make things any worse on that front.

Whilst that was the original means of cleanly depending on tools, go run $pkg@$version is I think the better reference point today.

(I appreciate however that the go run approach complicates how I review dependencies of tools used in the development of my module.)

Comment From: ConradIrwin

@rsc yes, I'm still happy to take this on (though I can't commit to a timeline, I'll send changes as they're ready).

What's the best way to get eyes on them, should I assign to you @bcmills?

I have a first one for the syntax here: https://go-review.googlesource.com/c/mod/+/508355.

Comment From: bcmills

@ConradIrwin, yes, you can assign reviews to me and @matloob (as you have done).

Comment From: rsc

Based on the discussion above, this proposal seems like a likely accept. β€” rsc for the proposal review group

Comment From: Merovius

AIUI, the proposal in the top-post is not what is now being marked Likely Accept. Would it be possible to edit the top-post to either reflect the actual proposal or to at least link to a comment containing it? This issue is a bit long and it seems a big ask to read it all, just to find what is even proposed.

Comment From: ConradIrwin

The proposal is here: https://go-review.googlesource.com/c/proposal/+/495555.

I'd love help getting that merged so it has a more permanent feeling to it, and I like @Merovius idea of editing the top comment (but I can't do that either).

Comment From: Merovius

@ConradIrwin Thanks. That looks like an exciting set of features.

Comment From: thepudds

FWIW, I updated the top comment here with a link to the proposal.

Comment From: willfaught

The proposal is here: https://go-review.googlesource.com/c/proposal/+/495555

I missed the part where we decided to use go tool instead of go run. Can someone point me to it?

This design uses "tool pkg" directives in go.mod, but I thought we were going to use comments (quoting Russ):

To make the contributions from tools clearer in go.mod, I think we can add a // tool-only comment on go.mod lines that are only there to provide tools. Similarly tool dependencies would say // indirect, tool-only.

Comment From: ConradIrwin

@willfaught I don't think go tool X vs go run X has been particularly debated. The original suggestion came from this comment: https://github.com/golang/go/issues/48429#issuecomment-1542860441, and I incorporated it into the next revision of the proposal. Is there a reason to choose go run X instead?

(In my mind go run X is slightly easier to type, but has the downside that run is already somewhat overloaded. On the flip side go tool X lets us use one word for this feature throughout, and makes it clearer that this isn't quite the same as run).

This design uses "tool pkg" directives in go.mod, but I thought we were going to use comments (quoting Russ):

The comments would be in addition to the tool directives, and were proposed to help people understand which require directives are coming from tools.

Comment From: willfaught

Thanks for pointing me to that. Makes sense why it's there. Thanks for explaining the comments.

On the flip side go tool X lets us use one word for this feature throughout, and makes it clearer that this isn't quite the same as run

How would it be different than run? I see in the design doc that go install tools would be a thing, but wouldn't go tool stringer have to basically do what go run does anyway? We can't assume the stringer in GOBIN is the version required by the module, even after go install tools.

There might be complications with using go tool:

  • Between Go team tools and third-party tools, which should take precedence for name conflicts? And how do you unambiguously refer to one or the other?
  • If Go team tool names take precedence, then what if the Go team wants to add a new tool? Will they be wary of "breaking" people by adding a name that might conflict with an existing third-party tool?

Comment From: fzipp

@willfaught Your last questions have been discussed before: https://github.com/golang/go/issues/48429#issuecomment-1550475817

Comment From: Merovius

@willfaught One difference to go run (presumably) is this line from go help run:

The exit status of Run is not the exit status of the compiled binary.

At least I would hope that go tool foo uses the exit status of foo? It also differs (AIUI) in that the compiled tool is cached.

Personally, I like go tool foo, because go run is already heavily overloaded. There is a syntactic overloading in that go run foo can also mean "run the main package in $GO{PATH,ROOT}/src/foo". For example, if I currently use go run foo in a small module, it says package foo is not in std. I think in practice, given that this is only used in module mode, that the design restricts tool aliasing to a single path component, and that there is no main package in a direct subfolder of $GOROOT/src, there should be no actual conflict. But the context-dependency and overloading of go run still makes this icky, IMO.

go tool foo also provides a clearer association between what is being run and the tool directive of go.mod.

Personally, I would prefer if we could give preference to user-defined tools over upstream-defined tools, as it would solve any potential breakage issues - similar to how identifiers can shadow predeclared identifiers, thus preventing code from breaking if we add new builtins - but I can see that this is probably not practical, given that the go tool itself might execute go tool to do its job.

Comment From: gopherbot

Change https://go.dev/cl/495555 mentions this issue: design/48429-go-tool-modules.md: new proposal

Comment From: gopherbot

Change https://go.dev/cl/521959 mentions this issue: cmd/go: add tools to "all"

Comment From: gopherbot

Change https://go.dev/cl/521958 mentions this issue: cmd/mod/edit: add -tool and -droptool support

Comment From: rsc

No change in consensus, so accepted. 🎉 This issue now tracks the work of implementing the proposal. - rsc for the proposal review group

Comment From: jimmyfrasche

Sorry if I missed it, but will there be an easy way to list the available tools? Handy for exploring new code, or for a tool to check whether there's a local version that should be invoked instead.

Comment From: ConradIrwin

@jimmyfrasche yes, you'll be able to run go tool with no arguments to get a list.
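
For example (a rough sketch only; the exact output may differ): in a module whose go.mod declares golang.org/x/tools/cmd/stringer as a tool, running go tool with no arguments would list the built-in tools together with the module's declared tools, something like

$ go tool
addr2line
asm
...
stringer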

Comment From: justinfx

In the case that two tool directives end in the same path segment, go tool X will error.

Any reason not to support aliasing the tool name, the same way imports work? If I have two tools named stringer, should I be able to do:

tool github.com/path/proj/cmd/stringer 
tool stringer2 github.com/path/proj2/cmd/stringer 

That would allow me to choose the output name of the tool.

Comment From: ConradIrwin

We considered this, but decided it added additional complexity for little benefit. Currently tools are installed to $PATH, and so tend to have unique names. You will still be able to use the full package path to distinguish between the two if this situation does arise.
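
For instance, with the two hypothetical stringer directives from the question above in go.mod, go tool stringer would error as ambiguous, but something like this should work (a sketch, not tested):

go tool github.com/path/proj2/cmd/stringer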

Comment From: whiten

The spec says:

Subsequent runs of go tool X will then check that the built binary is up to date, and only rebuild it if necessary to speed up re-using tools.

Is it decided whether the "up to date" check here will compare the Go version used to build a cached tool against the current Go toolchain? My company ran into a golang.org/x/tools warning about mismatched Go toolchains from https://github.com/golang/tools/blob/559c4300daa4efe55422df9bba86d125cdf1d9ef/go/packages/packages.go#L973 while using gqlgen, and it would be great if cached runs from go tool continued to resolve this the way go run does today. Thanks!

Comment From: ConradIrwin

@whiten That is a great call-out, thank you. I believe that is already handled by Go's caching, but will make sure that that is the case.

Comment From: ConradIrwin

@bcmills / @matloob I sent a few intro PRs for this several weeks ago, but haven't heard anything back. I'm not familiar with Gerrit, so please let me know if there's anything else I need to do to request a review.

  • https://go-review.googlesource.com/c/mod/+/508355
  • https://go-review.googlesource.com/c/go/+/521958
  • https://go-review.googlesource.com/c/go/+/521959

(If the changes make no sense, I'm happy to jump on a phone call and talk them through if that's helpful, otherwise happy to collaborate async).

Comment From: bcmills

@ConradIrwin, my apologies for the delay. I am still planning to get to these reviews, but it may take me a while to find the bandwidth for a proper review.

Comment From: ConradIrwin

Thanks! I will also stop blocking on your review then, and try to send the rest of the patch set. I think it's unlikely that changes to the patches submitted so far will require significant rework of later patches.

Comment From: gopherbot

Change https://go.dev/cl/534817 mentions this issue: cmd/go: add support for mod tools

Comment From: piroux

@ConradIrwin @bcmills Any updates ? Are the reviews of the 3 mentioned PRs on Gerrit still planned?

Comment From: ConradIrwin

@piroux thanks for checking in!

I have one more commit to send (to add go get support), and am still awaiting review.

The go get support was a bit trickier than expected (I hadn't considered that go get supports wildcards), but I hope to have it in a good state by the end of January, if not sooner.

Comment From: gopherbot

Change https://go.dev/cl/563175 mentions this issue: cmd/go: add support for go get -tool

Comment From: ConradIrwin

@bcmills @matloob I've uploaded the final pieces of this patchset (the changes to go get), and would love to try and get this merged for go1.23.

Please let me know if it would be helpful to talk through any of these changes, otherwise I look forward to your feedback in Gerrit!

Comment From: bcmills

@ConradIrwin, unfortunately I won't be able to review these changes after all; I'm leaving Google (and the Go team) this Friday, March 15.

Comment From: ConradIrwin

@bcmills wow, big change! I hope that everything turns out well for you.

@rsc / @matloob is there anyone else that would be interested in taking this on?

Comment From: acramsay

Hi all. I've been quietly watching this work for a few months now and I'm quite excited for this new functionality! That said, the last comments on this issue are a little discouraging. If possible, I would appreciate an update. Has anyone from the Go team taken over the code review? Is this progressing?

Thanks everyone for working toward this contribution!!

Comment From: matloob

Unfortunately, at this point I'm not sure if we're going to have the time to review this before the freeze.

That said, I'll try to start reviewing the CLs at the bottom of the stack and we can submit them together if we are able to get the stack done in time.

Comment From: ConradIrwin

@matloob I would love to have it reviewed before the next freeze if that's possible :D. Should I connect with you when the window opens again?

Comment From: matloob

Yes that sounds good. I'll try to see if I can start during the freeze so we can maybe make progress before the window opens.

Comment From: ConradIrwin

Great, thank you!

Comment From: meling

Just pinging to check on the progress of this one; since there is an implementation that mainly requires review, it seems like relatively low-hanging fruit to get this done before the window opens ;-)

Comment From: matloob

I've started the reviews of the CLs.

Comment From: pierrre

With this proposal, does the go.mod file contain the list of transitive dependencies of the tools?

Comment From: matloob

@pierrre Yes. Tools will be treated as if they were packages of the main module, so a tidy go.mod will contain all the dependencies needed to build them.
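
As a purely illustrative sketch (module path borrowed from earlier in this thread, version number made up), a tidied go.mod might then look roughly like

module example

go 1.24

tool github.com/99designs/gqlgen

require github.com/99designs/gqlgen v0.17.0 // version illustrative

// ...plus whatever transitive requirements go mod tidy records for the tool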

Comment From: stevenh

Nice to see this being worked on.

Comment From: gopherbot

Change https://go.dev/cl/614555 mentions this issue: cmd/mod/edit: disallow relative tool paths

Comment From: mcandre

Reminder that build-time dependencies installed via out-of-band methods are likely missing from SBOMs sent to SCA tools.

govulncheck, Snyk, Dependabot, etc. look primarily in go.mod for information about Go dependencies.

A few container SCA tools like Docker Scout, Red Hat Quay Container Security Operator, and Snyk Container may happen to match CVEs for vulnerable build-time Go dependency tools, but only for Go tools installed via OS package managers, and likely only for Go tools that are not removed from the final image by a modern multi-stage build.

Comment From: gopherbot

Change https://go.dev/cl/613095 mentions this issue: cmd/go: cache executables built for go run

Comment From: gopherbot

Change https://go.dev/cl/630695 mentions this issue: cmd/go/internal/tool: set Internal.ExeName on tool's package

Comment From: meling

Playing with tip I can do:

go install tool github.com/alta/protopatch/cmd/protoc-gen-go-patch
go get -tool google.golang.org/protobuf/cmd/protoc-gen-go

But these do not appear to work (tools metapackage):

go install tools
go build tools

Is the tools metapackage planned for Go1.24?

Comment From: matloob

@meling Apologies- the design doc is now a little out of date.

We changed the naming of the metapackage to tool to reduce user confusion: now users won't have to remember whether it's tool or tools. It's always tool: the go.mod directive is named tool, the metapackage is named tool, the go subcommand is tool, and the argument to go get is tool.
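
To make that concrete, here is a rough sketch of the consistent spelling (package path reused from elsewhere in this thread; commands are illustrative, not exact output):

# go.mod directive
tool golang.org/x/tools/cmd/stringer

# add a tool dependency to go.mod
go get -tool golang.org/x/tools/cmd/stringer

# run it
go tool stringer

# install everything in the tool metapackage to GOBIN
go install tool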

A few other major changes in the final tool support vs. the design doc:

  • Tools are supported in workspaces: just like within a single module, if there's an ambiguous tool name the full package path needs to be specified to go tool, and the tool metapackage consists of the union of each of the workspace modules' tool metapackages.
  • Tool executables are cached in the go build cache, using the new executable caching support (#69290), rather than in the $GOCACHE/tool directory.
  • Relative tool paths are no longer supported (https://go.dev/cl/614555).

Comment From: dmitshur

Adding a release blocker to track mentioning this new cmd/go feature in the Go 1.24 release notes.

Comment From: GoVeronicaGo

@matloob @samthanawalla, can you confirm that the code has been completed and only the release notes are pending for 1.24?

Comment From: mtibben

@dmitshur the Modules FAQ may also need to be updated

Comment From: gopherbot

Change https://go.dev/cl/632555 mentions this issue: cmd/go: add tool meta-pattern to go help packages

Comment From: gopherbot

Change https://go.dev/cl/632595 mentions this issue: _content/doc: document module tools

Comment From: gopherbot

Change https://go.dev/cl/632556 mentions this issue: doc/next: introduce module tools

Comment From: gopherbot

Change https://go.dev/cl/632596 mentions this issue: Modules: add go1.24 tools

Comment From: mknyszek

The RC is planned for next week, and we need a full draft of the release notes before then. Please prioritize writing the release notes for this. Thanks!

Comment From: thepudds

Using -modfile when getting a tool with the -tool flag seems to work:

$ go get -modfile=tool.mod -tool golang.org/x/tools/cmd/stringer
go: downloading golang.org/x/tools v0.28.0
[...]

But using -modfile when invoking a tool seems to not work currently:

$ go tool -modfile=tool.mod stringer
flag provided but not defined: -modfile

I don't know if that was a deliberate choice (I also didn't notice -modfile referenced in the design doc), but it would be nice if -modfile were supported when invoking a tool.

Checks above were using tip from earlier today (c46ba1f), and it might be that I misunderstood the new capabilities or otherwise made a mistake.

Comment From: ConradIrwin

@thepudds This will be fixed by https://go-review.googlesource.com/c/go/+/632575

Comment From: fredrikaverpil

Is the intention that this should work?

$ gotip mod -modfile=tools.mod init foo
go mod: unknown command
Run 'go help mod' for usage

I'm on gotip version: go version devel go1.24-c46ba1f Wed Dec 4 22:20:08 2024 +0000 darwin/arm64

Comment From: thepudds

Hi @ConradIrwin, great, thanks! 🚀🎉

@fredrikaverpil, if you want to use -modfile with go mod init, I think the proper spelling is with the -modfile flag after the init:

$ go mod init -modfile=tool.mod foo

Comment From: gopherbot

Change https://go.dev/cl/632575 mentions this issue: cmd/go: add -modfile and -modcacherw to go tool

Comment From: ConradIrwin

Now that this is out in the RC, I've put up a blog post explaining some of the motivation for the change here: https://cirw.in/blog/go-tools

Comment From: mcandre

For clarification:

  • Where do the tool executables live, $GOPATH/bin?
  • Do you have to invoke the tools with a go tool prefix, like npx and bundle exec, or nah?
  • I assume that the same version identifier syntaxes, including VCS tags as well as commit ID replace directives, work for tools as they do for non-tool Go modules, right?

Comment From: meling

Quick reply with this blog post. It doesn't answer all your questions, but shows (at the bottom of the post) that you can install tools both globally and locally for the current module.

Comment From: ConradIrwin

@mcandre

  • Tool executables live in the build cache, and are built on demand (and expired when unused). This lets go tool select which version of a tool is used depending on the active module.
  • You invoke them with go tool <toolname>. If you want to (and versioning is not a concern) you can install all tools for the current module to GOBIN with go install tool.
  • Yes. Tool dependencies participate in the same module graph as imported dependencies, and require/replace/exclude directives apply in the same way.
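
On the last point, a purely hypothetical go.mod sketch (paths and version made up) showing a replace directive applied to a tool's module, just as it would be for an imported dependency:

module example

go 1.24

tool golang.org/x/tools/cmd/stringer

require golang.org/x/tools v0.28.0 // version illustrative

// point the tool at a local checkout while debugging, just like any other dependency
replace golang.org/x/tools => ../tools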

@meling nice post! You might want to recommend go get -tool instead of go mod edit followed by go get -u tool, to save people a command or two.

Comment From: gopherbot

Change https://go.dev/cl/638296 mentions this issue: _content/doc/go1.24: fix meaning of -u in 'go get -u tool'

Comment From: nikolaydubina

[!NOTE] See the comments below about go mod pruning; this comment may not be as relevant, given that tool modules will get pruned out.

I'm surprised to see this accepted with the approach of merging tool dependency versions together with code dependencies. I am all for versioning the dependencies of tools, but it has to be kept separate from the versioning of the artefacts we actually ship, to keep clarity about what is really used in compiling, linking, and embedding those artefacts.

If go.mod no longer means that these dependencies are included in compilation, then it loses its meaning: what you see may or may not be used (as in compiled into a binary, linked, or embedded) in your project. Dependency tracking becomes a bag of everything, more obscure and less reliable. This is a step away from clarity and usefulness.

This also goes against the ethos of minimal dependencies. So far, dependencies in Go have been fairly stable and minimal; now go.mod will jump versions and include/exclude dependencies all the time, given that tools are not as stable.

Projects that used tools.go may not see a difference, but projects that were intentionally designed to avoid including tool dependencies will now see their go.mod polluted. (Also, did anyone survey opposing views or how prevalent the opposing design is in the Go ecosystem? IMHO, it strongly looks like this feature tilts everyone toward the tools.go approach. The old approach allowed people the freedom to choose a solution; now one standard is being enforced on everyone.) Thus, projects trying to keep dependencies small will be forced not to use tools, which is likely not a good step for the Go ecosystem.

Including tool versions and their dependencies mixed in with source code seems a bad way forward.

And what about transitive dependencies? Should we expect a dependency apocalypse where some random tool used in a project you depend on appears in your go.mod, and in all go.mod files of projects that depend on you?

Comment From: pete-woods

For what it's worth I intend to continue using a separate tools/go.mod to manage tool versions with this new feature, like I do today with the existing "hack". I've got a wrapper bash script, so I think it'll just be changing the calls in that to use ~(cd tools && go tool xxx)~ instead (edit: apparently go tool -modfile tools/go.mod xxx will also work).

I don't want tool dependencies cluttering up the dependency tree either. Good libraries are very particular about minimising deps, and you can look at their go.mod and see it's not full of bloat when you decide to use them.

Comment From: znkr

and in all go.mod files of projects that are depending on you

Note that since go1.17, go.mod files of dependent projects are pruned to include only the dependencies necessary for building the main modules (https://go.dev/ref/mod#graph-pruning).

Comment From: DmitriyMV

@pete-woods you don't need a separate folder; you can use the -modfile flag. See this https://github.com/golang/go/issues/48429#issuecomment-923429528.

Example:

~> go mod init -modfile=tools.mod <module_name>
~> go get -modfile=tools.mod -tool golang.org/x/tools/cmd/goimports@latest
# or
~> go get -u -modfile=tools.mod -tool golang.org/x/tools/cmd/goimports@latest

Comment From: pete-woods

@pete-woods you don't need a separate folder; you can use the -modfile flag. See this https://github.com/golang/go/issues/48429#issuecomment-923429528.

Good tip!

Comment From: nikolaydubina

Due to module pruning, when you depend on a module that itself has a tool dependency, requirements that exist just to satisfy that tool dependency do not usually become requirements of your module. - tools doc

You are right: with go1.17 module graph pruning, including tools may be okay if they are going to get pruned.

It would be good to mention this more prominently in the go tool and v1.24 release docs; lots of people are confused about this.

Comment From: mcandre

Wrapper bash scripts break many environments: native Windows, bare Alpine, bare Redox. Various and sundry FOSS projects would be portable except for their vendor-locked build systems.

Comment From: liggitt

Including tool versions and their dependencies mixed in with source code seems a bad way forward.

I agree with this, and in projects I work on, we'll still keep a separate go.mod file for tools, so that MVS on our main library modules isn't influenced by tool dependencies.

Being able to reference the main packages of tools, and have them referenced specifically as tools, from our distinct tools go.mod file is nice.

Comment From: mcandre

Does go mod not create a separate section for dev dependencies, like RubyGems and NPM have done for years?

Comment From: liggitt

Does go mod not create a separate section for dev dependencies, like RubyGems and NPM have done for years?

No

Comment From: gophun

agree with this, and in projects I work on, we'll still keep a separate go.mod file for tools

Then do it: this feature gives you everything you need for that, as others have pointed out before.

Comment From: seankhliao

please use the various forums for general discussion / question & answers (rather than rehash earlier discussions). if there are bugs please file them as new issues.

Comment From: mcandre

agree with this, and in projects I work on, we'll still keep a separate go.mod file for tools

Then do it, this feature gives you all you need to do it, as others have pointed out before.

Where is the documentation for this style of usage? I guess installing/updating just needs a flag on the go command to divert away from the normal go.mod file (by convention, tools.mod).

Go 1.25 would do well to automatically look for and apply this file.

Comment From: pete-woods

It's not particularly helpful to hide people's answers to genuine questions as off topic for a ticket that is likely to appear at the top of search results.

Comment From: zigo101

It looks like using go get tool to update all tool dependencies is a bad idea; it should be go get -tool all instead. See https://github.com/golang/go/issues/71437

Comment From: mvdan

Just wanted to say thanks so much for working on this feature! Just for the caching of binaries alone, this is amazing - I'm seeing go generate ./... regularly drop from e.g. 8s to 1s thanks to that :) The UX is a healthy improvement as well.