The bug reappeared after all, and the fix introduced a different bug. We
want to release 2.27 imminently, so there is no time to do a proper fix,
which appears to require a larger reworking. Hopefully we will have it
for 2.28.
This reverts commit c98525235f.
... as intended.
Requirements:
- don't build fresh libraries for each git commit
- have the git commit available in the CLI
Bug:
- `echo ${version}` went into the wrong file => use the fact that it's
  a symlink, not just for reading but also for writing.
Logging to another Logger was kind of nonsensical - it was really just
an easy way to get it to write its output to stderr, but that only
works if the underlying logger writes to stderr.
This change is needed to make it easy to log JSON output somewhere
else (like a file or socket).
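
As an illustration only (the type and method names below are hypothetical,
not Nix's actual `Logger` interface), the idea is to write each JSON line
straight to whatever stream the caller supplies, instead of funnelling it
through another `Logger` that happens to print to stderr:

```
#include <fstream>
#include <iostream>
#include <string>

// Hypothetical sketch, not the real Nix Logger API: emit one JSON object
// per line directly to a caller-supplied stream (stderr, a file, a socket).
struct JSONLineLogger
{
    std::ostream & out;

    void log(const std::string & jsonLine)
    {
        out << jsonLine << '\n' << std::flush; // flush so consumers see lines promptly
    }
};

int main()
{
    std::ofstream file("log.json");          // illustrative destination
    JSONLineLogger logger{file};             // or {std::cerr} for the old behaviour
    logger.log(R"({"action":"msg","msg":"hello"})");
}
```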
It is unused in Nix currently, but will be used in Hydra. This reflects
what Hydra does in https://github.com/NixOS/hydra/pull/1387.
We may want to use it more widely for better SSH store performance, but
this needs more testing before we do that.
This is a first step towards PR #10760, and the issues it addresses.
See the Doxygen for details.
Thanks to these changes, we are able to drastically restrict how the
rest of the code-base uses `ParseDerivation`.
Co-Authored-By: HaeNoe <git@haenoe.party>
I do not believe there is any problem with computing
`hashDerivationModulo` the normal way with impure derivations.
Conversely, the way this used to work is very suspicious: two
almost-equal derivations that differ only in depending on different
impure derivations could have the same drv hash modulo, even though
there is no reason to think those two different impure derivations will
end up producing the same content-addressed data!
Co-authored-by: Alain Zscheile <zseri.devel@ytrizja.de>
"content-address*ed*" derivation is misleading because all derivations
are *themselves* content-addressed. What may or may not be
content-addressed is not the derivation itself, but the *output* of the
derivation.
The outputs are not *part* of the derivation (for then the derivation
wouldn't be complete before we built it) but rather separate entities
produced by the derivation.
"content-adddress*ed*" is not correctly because it can only describe
what the derivation *is*, and that is not what we are trying to do.
"content-address*ing*" is correct because it describes what the
derivation *does* --- it produces content-addressed data.
This is a big step towards documenting the store layer on its own, separately from the evaluator (and `builtins.derivation`).
Co-authored-by: Robert Hensing <roberth@users.noreply.github.com>
Curl creates sockets without setting FD_CLOEXEC/SOCK_CLOEXEC; this can
cause connections to remain open forever when using commands like `nix
shell`.
This change sets the FD_CLOEXEC flag using a CURLOPT_SOCKOPTFUNCTION
callback.
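
For reference, a minimal sketch of that kind of callback (the function
name is illustrative and may not match the actual code):

```
#include <curl/curl.h>
#include <fcntl.h>

// Mark every socket curl opens as close-on-exec so that child processes
// (e.g. the shell spawned by `nix shell`) do not inherit it.
static int setCloexec(void * /*clientp*/, curl_socket_t fd, curlsocktype /*purpose*/)
{
    return fcntl(fd, F_SETFD, FD_CLOEXEC) == -1
        ? CURL_SOCKOPT_ERROR
        : CURL_SOCKOPT_OK;
}

// When configuring a transfer:
//   curl_easy_setopt(handle, CURLOPT_SOCKOPTFUNCTION, setCloexec);
```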
This fixes dynamic derivations, reverting #9081.
I believe that this time around, #9052 is fixed. When I first rebased
this, tests were failing (which wasn't the case before). Those test
failures were due to the crude way in which the outer goal tried to exit
with the inner goal's status.
Now, that error handling has been reworked to be more faithful. The exit
status and exception of the inner goal are returned by the outer goal.
The exception was what was causing the test failures, but I believe it
was not having the right error code (there is more than one for failure)
that caused #9081.
The only cost of doing things the "right way" was that I had to
introduce a hacky `preserveException` boolean. I don't like this, but,
then again, none of us like anything about how the scheduler works.
Issue #11927 is still there to clean everything up; that should subsume
the need for any `preserveException`, because I doubt we will be fishing
information out of state machines like this at all.
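
To sketch the shape of that (with made-up names, not the actual Goal
classes): the outer goal hands back the inner goal's exit code unchanged
and, under one plausible reading of `preserveException`, rethrows the
inner goal's exception as-is rather than re-encoding it:

```
#include <exception>

// Hypothetical illustration only. The outer goal returns the inner goal's
// exit code verbatim (there is more than one code that means failure) and,
// when preserveException is set, rethrows the inner goal's exception unchanged.
struct InnerResult
{
    int exitCode = 0;
    std::exception_ptr ex;   // set if the inner goal threw
};

struct OuterGoal
{
    bool preserveException = false;

    InnerResult finish(const InnerResult & inner)
    {
        if (preserveException && inner.ex)
            std::rethrow_exception(inner.ex);
        return inner;        // exit status propagated as-is
    }
};
```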
This reverts commit 8440afbed7.
Co-Authored-By: Eelco Dolstra <edolstra@gmail.com>
The main improvement is that the new message gives an example of a path
that is referenced, which should make it easier to track down. While
there, I also clarified the wording, saying exactly why the paths in
question were illegal.