Mirror of https://github.com/NixOS/nix (synced 2025-07-02 17:41:48 +02:00)
Add 'download-buffer-size' setting
We are piping curl downloads into `unpackTarfileToSink()`, but the latter is typically slower than the former on a fast connection, so the download can appear unnecessarily slow. (There is even a risk that, if the Git import is *really* slow for whatever reason, the TCP connection could time out.)

So let's make the download buffer bigger by default: 64 MiB is big enough for the Nixpkgs tarball. Perhaps in the future we could have an unlimited buffer that spills data to disk beyond a certain threshold, but that's probably overkill.
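Concretely, the new limit is exposed as a `download-buffer-size` setting and read as `fileTransferSettings.downloadBufferSize` in the transfer code. The sketch below shows a plausible declaration following Nix's usual `Setting<T>` pattern on `FileTransferSettings`; the exact field type, header location, and description text are assumptions for illustration, not copied verbatim from this commit.

```cpp
// Hypothetical sketch, not verbatim from the commit: a buffer-size setting
// declared on FileTransferSettings using Nix's Setting<T> machinery.
Setting<uint64_t> downloadBufferSize{
    this, 64 * 1024 * 1024, "download-buffer-size",
    "The size (in bytes) of Nix's internal download buffer. The 64 MiB "
    "default is large enough to hold the Nixpkgs tarball while slower "
    "consumers such as unpackTarfileToSink() catch up with curl."};
```

With such a setting in place, the limit should be adjustable like any other Nix option, e.g. `download-buffer-size = 134217728` in `nix.conf`, or `--option download-buffer-size 134217728` on the command line.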
Parent commit: caf4e98f0c
Commit: 8ffea0a018
2 changed files with 7 additions and 1 deletion
@@ -858,7 +858,7 @@ void FileTransfer::download(
                    buffer). We don't wait forever to prevent stalling the
                    download thread. (Hopefully sleeping will throttle the
                    sender.) */
-                if (state->data.size() > 1024 * 1024) {
+                if (state->data.size() > fileTransferSettings.downloadBufferSize) {
                     debug("download buffer is full; going to sleep");
                     state.wait_for(state->request, std::chrono::seconds(10));
                 }
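The hunk above is the heart of the change: the hard-coded 1 MiB threshold becomes the configurable `downloadBufferSize`, and the download thread sleeps for at most 10 seconds at a time whenever the buffer is over the limit. As a rough, self-contained illustration of that pattern (not the actual FileTransfer code; all names and sizes are invented for the example):

```cpp
// Standalone sketch of the bounded-buffer throttling idea: a fast producer
// (the "download thread") appends to a shared buffer and sleeps, with a
// bounded wait, whenever the buffer exceeds a configurable limit, while a
// deliberately slower consumer (the "sink") drains it.
#include <chrono>
#include <condition_variable>
#include <cstddef>
#include <iostream>
#include <mutex>
#include <string>
#include <thread>

int main()
{
    std::mutex mutex;
    std::condition_variable wakeup;
    std::string data;                             // stands in for state->data
    bool done = false;
    const std::size_t downloadBufferSize = 4096;  // stands in for the 64 MiB default

    // Producer: plays the role of the curl download callback.
    std::thread producer([&] {
        for (int i = 0; i < 100; ++i) {
            std::unique_lock<std::mutex> lock(mutex);
            // Buffer full: go to sleep, but never for more than 10 seconds,
            // so the producer cannot be stalled indefinitely.
            while (data.size() > downloadBufferSize)
                wakeup.wait_for(lock, std::chrono::seconds(10));
            data.append(1024, 'x');               // "received" 1 KiB from the network
            wakeup.notify_all();
        }
        std::unique_lock<std::mutex> lock(mutex);
        done = true;
        wakeup.notify_all();
    });

    // Consumer: plays the role of unpackTarfileToSink(), artificially slow.
    std::thread consumer([&] {
        for (;;) {
            std::string chunk;
            {
                std::unique_lock<std::mutex> lock(mutex);
                wakeup.wait(lock, [&] { return !data.empty() || done; });
                if (data.empty() && done)
                    break;
                chunk.swap(data);                 // drain the buffer
                wakeup.notify_all();              // wake a producer waiting on "full"
            }
            std::this_thread::sleep_for(std::chrono::milliseconds(5));
            std::cout << "consumed " << chunk.size() << " bytes\n";
        }
    });

    producer.join();
    consumer.join();
}
```

The bounded wait mirrors the comment in the diff: sleeping throttles the sender while the consumer catches up, without letting a wedged consumer block the download thread forever.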