make the makefile slightly more elegant #86
Conversation
Thanks for the PR mpldr. Some comments below. There are reasons to have the vendor directory in the repo:
Not all Makefile changes seem like a clear win (or more elegant) to me, but some of the incremental build changes may be useful. Details:
As for performance of incremental builds. This is approximately what I'm getting for an incremental build without changes with the current Makefile (durations in real/wall time, not user/system time):
In general, if changes make the makefile less readable, they may simply not be worth it...
On Sun Oct 22, 2023 at 2:08 PM CEST, Mechiel Lukkien wrote:
> 1. The repo stays self-contained, not requiring additional downloads (dependencies) to build. Only the Go toolchain is currently required. I have a separate CI pipeline that builds without making any requests.
> 2. By including dependencies, diffs show the changes in dependencies. This helps both to review changes in dependencies, and it sometimes helps to show that an innocent-looking addition of a dependency actually pulls in a lot of code, which can be a reason to choose another approach.
Sure, that is a valid concern. I'll drop the commit from the PR.
> Not all Makefile changes seem like a clear win (or more elegant) to me, but some of the incremental build changes may be useful. Details:
> - Only generating new docs and api json files when the source files have changed could be good. I'm not so sure the current dependencies for the api files are complete.
For that, the list can easily be amended. Only direct inputs have to be
considered though. So if you have some process that updates the JSON
from some source, that step would ideally be added to the Makefile as
well.
> For the docs, they should also depend on the Go files in the main package that implement the commands.
I find the gendoc.sh file a bit scary, to be quite honest. This might be
because I am more accustomed to manpages and don't expect this to be
inside the API documentation.
> Depending on all go files would be safer, but may also defeat some of the purpose. Perhaps there is a Go command to fetch package dependencies?
The Go toolchain exposes a crazy amount of details, I am not too sure on
the specifics in this case, though.
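There is such a command: `go list -deps` walks the import graph. A minimal sketch of how the Makefile could ask the toolchain for a complete prerequisite list (assuming GNU make and a Go toolchain on PATH; the variable name `webadmin_deps` is made up):

```make
# Hypothetical: let the Go toolchain compute the files webadmin actually
# depends on, instead of maintaining the list by hand. "go list -deps"
# prints the transitive package imports; the template expands each
# package to its Go source files. ($$ escapes $ for make.)
webadmin_deps := $(shell go list -deps -f '{{range .GoFiles}}{{$$.Dir}}/{{.}} {{end}}' ./webadmin)

webadmin/adminapi.json: $(webadmin_deps)
```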
> Question is if it ends up being faster. See below for some more details on incremental build speed.
Speed is only part of the reasoning, another – in my eyes more important
part – is the removal of the "oh, I forgot to update the docs"-factor.
The Makefile does that for you.
> - To replace the "go" command, I typically just make sure to set a PATH that has another "go" earlier in the list before running make. No need to add support for replacing commands in all Makefiles. The same would apply to other commands. For example, I regularly run `goprev make build test`, with:
While this might be an easier solution once you have such a script,
I would argue that if one does not have such a script, calling
`make GO=~/go/bin/go1.13` might be a tad easier.
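For reference, the pattern under discussion is only two lines; `?=` keeps the PATH-based workflow intact because the default stays the bare command name (sketch; target abbreviated):

```make
# Overridable on the command line: make GO=~/go/bin/go1.13 build
GO ?= go

build:
	$(GO) build
```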
> - Are you sure $gosrc is complete?
As long as it has a .go extension, it is included. Even if it has to be
regenerated.
> In my experience, the Go tooling is solid and fast enough to let it handle incremental builds. Trying to do it again in a Makefile is bound to get out of date. So I would say it's only worth it if it gives a significant performance boost, and I don't think that's the case.
As stated above, the performance concerns are secondary. If you were to
build a huge npm project, the performance gains would no longer be
negligible.
> - I would not use -trimpath by default. I use the full paths in panic backtraces. With -trimpath, clicking a path:line wouldn't open the file in editors because the path doesn't exist. I think GOFLAGS is also picked up from the environment, so you could just run `GOFLAGS=-trimpath make`, or set it once in the shell you're running make in.
I think for general building, having `-trimpath` is actually beneficial,
because the paths are easier to resolve when debugging. Especially since
most users will either build from source, or take a pre-made build and
will not look at these files. I would approach it using "let it do the
sensible thing for most people" by default.
> - I find target name "lint" misleading. It does checks, but it doesn't call golint, the (once) standard linter in Go.
Linting describes a process and not a specific program.
> - I'm not a fan of ".PHONY". I certainly don't find it elegant. And I don't think they are needed, the targets aren't created in their build steps. Only if someone manually creates files named "check", "test", "test-race", would there be a problem. That will be noticed soon enough.
.PHONY is the designed way to mark targets that do not produce a file.
It is "correct" to do it, but if you prefer it can be removed.
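The failure mode `.PHONY` guards against is easy to demonstrate with a toy Makefile (self-contained sketch; assumes GNU make is installed):

```shell
# A stray file named after a target silently disables that target.
tmp=$(mktemp -d) && cd "$tmp"

printf 'check:\n\t@echo running checks\n' > Makefile
make check        # prints: running checks
touch check       # someone creates a file called "check"
make check        # prints: make: 'check' is up to date.

# Marking the target phony makes make ignore the stray file:
printf '.PHONY: check\ncheck:\n\t@echo running checks\n' > Makefile
make check        # prints: running checks
```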
> - I tend to stay away from the magic makefile variables like "$@". It makes the makefiles unreadable to the untrained eye. And even though I use and write makefiles, I'm eternally untrained because I don't edit them often enough. Just having a few paths duplicated is more clear, even untrained eyes will understand what's going on. Long dependency lists may be more readable as variable though.
I mainly use them because they allow me to spot patterns like "oh, hey,
I am always doing XYZ", like in this case web*/*api.json. The only
discrepancy is the Webmail one, otherwise these could be combined into
a single instruction.
I don't consider myself a make-wizard by any means and macros scare the
shit out of me, but overall I do like my makefiles nice and small with
as few rules as necessary.
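For untrained eyes, these are the automatic variables in play (a sketch of the adminapi rule; the sherpadoc flags are abbreviated):

```make
# $@ = the target; $(dir $@) and $(notdir $@) split it into
# its directory and filename.
webadmin/adminapi.json: $(wildcard webadmin/*.go)
	cd $(dir $@) && $(GO) run github.com/mjl-/sherpadoc/cmd/sherpadoc Admin >$(notdir $@)
```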
> - given the new risk of incomplete dependencies
To be honest, I don't see that risk, but I could look into `go tool`
> - go vet is slow. Perhaps vet can be started early in the background? I don't know of a Makefile feature for that. Not sure if a shell script with a background job is worth it.
-j would be the switch for that, but it requires that all generated
files are marked as dependencies, if applicable, otherwise you may run
into issues where a dependency wasn't made in time. (one of the reasons
to be thorough here, imho)
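Concretely, with complete dependency edges the slow vet step could run alongside the build (hypothetical sketch; target and variable names assumed):

```make
# "make -j2 build vet" runs both concurrently. Without correct
# prerequisites on generated files, -j may consume them before they exist.
build: $(gosrc)
	$(GO) build

vet: $(gosrc)
	$(GO) vet ./...
```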
> - The easiest time savings appear to be not running gendocs, sherpadoc and sherpats if not needed.
Which is already a good starting point.
> In general, if changes make the makefile less readable, they may simply not be worth it...
I agree in part, here. A reliable makefile that is harder to read than a
simpler, less reliable one does have its worth, but making it more complex
just for the sake of it is not something I am a fan of.
--
Moritz Poldrack
https://moritz.sh
Colors may fade.
webmail/api.ts: webmail/api.json $(gomod)
	$(GO) run github.com/mjl-/sherpats/cmd/sherpats -bytes-to-string -slices-nullable -maps-nullable -nullable-optional -namespace api api <$< >$@

.mox.tmp: main.go
I would love to get rid of this abomination, but with gendoc.sh
creating a circular dependency with the mox
target, this has to stay as long as gendoc.sh is a thing.
This also adds the possibility to use the new docker compose subcommand instead of the standalone command.
> - Only generating new docs and api json files when the source files have changed could be good. I'm not so sure the current dependencies for the api files are complete.
For that, the list can easily be amended. Only direct inputs have to be
considered though. So if you have some process that updates the JSON
from some source, that step would ideally be added to the Makefile as
well.
The API JSON files are created in the Makefile, by calling sherpadoc. Like this
in the PR:
webadmin/adminapi.json: $(wildcard webadmin/*.go) $(gomod)
	cd webadmin && CGO_ENABLED=0 $(GO) run github.com/mjl-/sherpadoc/cmd/sherpadoc -adjust-function-names none Admin >$(notdir $@)
However, that doesn't seem complete to me: webadmin/*.go import files from other
packages, and exposes their types in the API (json). Of course we can add more
dependencies, but there is always a risk it isn't complete. That increased
risk/brittleness has to be offset by something, e.g. build speedup.
> For the docs, they should also depend on the Go files in the main package that implement the commands.
I find the gendoc.sh file a bit scary, to be quite honest. This might be
because I am more accustomed to manpages and don't expect this to be
inside the API documentation.
Yeah. The idea is that the subcommands carry their own documentation. I'm not
following the old unix approach of separate manual pages anymore (I have done
that in the past, but times are changing). Having docs included in the binary is
easier for packaging, one less step. It also helps with keeping docs up to date:
They are in the code, instead of in a separate file that is forgotten.
The reason to put it in a big comment in a Go file as well is to automatically
get readable documentation on pkg.go.dev site. It saves me (so far) having a
separate place (website) to publish documentation. So, it's just a way to have
to do less work, keeping the code/project maintainable/feasible.
> Question is if it ends up being faster. See below for some more details on incremental build speed.
Speed is only part of the reasoning, another – in my eyes more important
part – is the removal of the "oh, I forgot to update the docs"-factor.
The Makefile does that for you.
The Makefile already did that: It just always does a full build. With the PR
changes, there is only increased risk of not building something that needs to be
rebuilt. So if we don't get a build speed up, there isn't really a point in
making all these changes. The makefile is just a convenient way to run the
required commands to build the project.
In the past/with slower compilers, you would specify all the dependencies
carefully so you wouldn't have long waits for work that didn't really need to
happen. With the tradeoff between "faster" and "accidentally not rebuilding
after a dependency changed", given that we don't become much faster, the risk of
accidentally not building seems too high to me.
> - To replace the "go" command, I typically just make sure to set a PATH that has another "go" earlier in the list before running make. No need to add support for replacing commands in all Makefiles. The same would apply to other commands. For example, I regularly run `goprev make build test`, with:
While this might be an easier solution once you have such a script,
I would argue that if one does not have such a script, calling
`make GO=~/go/bin/go1.13` might be a tad easier.
True, but what/who are we optimizing for? Any developer wanting to build with a
different go binary (or docker binary), which is quite rare, will know how to make it happen.
Developers can just edit the Makefile. Because there is currently no magic, it's
clear what needs to be edited.
> - Are you sure $gosrc is complete?
As long as it has a .go extension, it is included. Even if it has to be
regenerated.
It only includes .go files. But there are also embedded non-go files in the
binary. If those change, the binary should be rebuilt as well.
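The embedded files are also discoverable through the toolchain, so the prerequisite list could in principle cover them too (hedged sketch; assumes Go 1.16+, which added the `EmbedFiles` field, and a made-up variable name):

```make
# go list reports //go:embed files per package; expanding them to full
# paths lets make rebuild the binary when an embedded asset changes.
goembed := $(shell go list -f '{{range .EmbedFiles}}{{$$.Dir}}/{{.}} {{end}}' ./...)

mox: $(gosrc) $(goembed)
```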
> - I would not use -trimpath by default. I use the full paths in panic backtraces. With -trimpath, clicking a path:line wouldn't open the file in editors because the path doesn't exist. I think GOFLAGS is also picked up from the environment, so you could just run `GOFLAGS=-trimpath make`, or set it once in the shell you're running make in.
I think for general building, having `-trimpath` is actually beneficial,
because the paths are easier to resolve when debugging. Especially since
most users will either build from source, or take a pre-made build and
will not look at these files. I would approach it using "let it do the
sensible thing for most people" by default.
Using trimpath does not make it easier to resolve paths for me. Absolute paths
are unambiguous. I can just click to open them. For trimmed relative paths, they
need to be evaluated against the correct working directory. I suppose your
workflow is different somehow? Perhaps you are thinking about getting a panic
backtrace from a user? I'm mostly concerned with local development and
debugging.
I expect non-technical users to download a binary from gobuilds.org, which are
built with trimpath.
I expect slightly more technical users to just call "go install
***@***.***". They could add -trimpath, but the full paths to the
source can only help in case of problems.
I expect only developers changing the code to use the Makefile for building.
They are likely developing & testing locally. So absolute paths are useful there.
> - I find target name "lint" misleading. It does checks, but it doesn't call golint, the (once) standard linter in Go.
Linting describes a process and not a specific program.
I know, doesn't change that I think it's misleading. (:
"make lint" makes you think it could be running "golint".
"make check" makes you think "this will run checks". It calls
the well-known tool staticcheck. I think this is the path of
least surprise.
> - I'm not a fan of ".PHONY". I certainly don't find it elegant. And I don't think they are needed, the targets aren't created in their build steps. Only if someone manually creates files named "check", "test", "test-race", would there be a problem. That will be noticed soon enough.
.PHONY is the designed way to mark targets that do not produce a file.
It is "correct" to do it, but if you prefer it can be removed.
I'm not really looking for a "correct" way to use Makefiles. I'm primarily
interested in using Makefile to run the needed commands, only in a pragmatic way.
> - given the new risk of incomplete dependencies
To be honest, I don't see that risk, but I could look into `go tool`
Both building the binary and the JSON files in the PR have incomplete
dependencies, so it's clearly a pretty big risk!
> In general, if changes make the makefile less readable, they may simply not be worth it...
I agree in part, here. A reliable makefile that is harder to read than a
simpler, less reliable one does have its worth, but making it more complex
just for the sake of it is not something I am a fan of.
It's not reliable vs simple. It's "simple and reliable but possibly doing more
work than strictly needed" versus "more complicated, doing the minimum work to
build to get faster builds, but risking that the targets aren't always rebuilt
when they should".
I think the proposed changes aren't worth the complexity. I know I'm more strict
that way than most people. Only changes that keep the Makefile simple, improve
build speed and don't increase (by too much) the risk of not building enough,
could be worthwhile.
I think I may have misinterpreted the purpose of this Makefile then :) it's probably best to just close this PR as I don't think this is going to lead anywhere.