Hydra dual writes to >= 2 locations #1367
IMO we could get away with it not being done in parallel (Hydra already has far more than enough parallelism to keep itself busy). What is non-negotiable is that it compresses only once, and I think this might require some fairly deep changes to Nix itself, since Hydra mostly leaves Nix in charge of doing the store writes. Hopefully not...
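To make the "compress only once" constraint concrete, here is a minimal Python sketch, assuming boto3 and two hypothetical bucket names; the real write path lives inside Nix's binary-cache store code, so this only illustrates the shape of the constraint, not where the change would land:

```python
# Sketch: compress a NAR exactly once, then push the identical bytes to
# every target. Bucket names, key layout, and the xz choice are assumptions.
import lzma

import boto3


def upload_nar_everywhere(nar_bytes: bytes, key: str, buckets: list[str]) -> None:
    compressed = lzma.compress(nar_bytes)  # the single compression pass
    s3 = boto3.client("s3")
    for bucket in buckets:  # N uploads of the same compressed bytes
        s3.put_object(Bucket=bucket, Key=key, Body=compressed)


upload_nar_everywhere(
    nar_bytes=open("/tmp/example.nar", "rb").read(),  # placeholder NAR
    key="nar/example.nar.xz",
    buckets=["cache-primary", "cache-secondary"],  # hypothetical buckets
)
```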
It's probably easier to have one box receive and multiplex the uploads.
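A rough sketch of such a multiplexing box, written as a tiny HTTP fan-out proxy in Python; the backend URLs and port are assumptions, and a real deployment would also need streaming, retries, and authentication:

```python
# Sketch: accept an upload once, forward the identical bytes to N backends.
from http.server import BaseHTTPRequestHandler, HTTPServer

import requests

BACKENDS = ["http://cache-a.internal", "http://cache-b.internal"]  # hypothetical


class FanOutHandler(BaseHTTPRequestHandler):
    def do_PUT(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        # Forward to each backend in turn, stopping at the first failure
        # so the uploader sees an error and can retry the whole upload.
        ok = all(requests.put(backend + self.path, data=body).ok
                 for backend in BACKENDS)
        self.send_response(201 if ok else 502)
        self.end_headers()


if __name__ == "__main__":
    HTTPServer(("", 8080), FanOutHandler).serve_forever()
```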
If you want Hydra itself to do this, one relatively easy way to configure it could be with a RunCommand.
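For illustration only, a hypothetical RunCommand hook could shell out to `nix copy`; this assumes the plugin hands the script a `$HYDRA_JSON` metadata file whose `outputs` entries carry store paths (check what your Hydra version actually emits), and the secondary store URI is made up:

```python
#!/usr/bin/env python3
# Sketch of a RunCommand hook: after a build finishes, copy its outputs to
# a secondary store. The JSON schema and store URI are assumptions.
import json
import os
import subprocess

SECONDARY_STORE = "s3://cache-secondary?region=eu-west-1"  # hypothetical

with open(os.environ["HYDRA_JSON"]) as f:
    build = json.load(f)

paths = [out["path"] for out in build.get("outputs", [])]
if paths:
    subprocess.run(["nix", "copy", "--to", SECONDARY_STORE, *paths], check=True)
```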
Is your feature request related to a problem? Please describe.
The write-path counterpart of NixOS/infra#394.
In preparation for experimenting with alternative store backends, Hydra needs to be able to write the same produced contents to multiple store targets, e.g. the primary S3 backend plus a second S3 bucket in another location.
Describe the solution you'd like
(1) Hydra gains support for copying the outputs to multiple locations, ideally in parallel (a sketch follows this list)
(2) Direct access to the secondary (and further) locations serves the expected files on a local deployment (NixOS test?)
(3) Suggest a change to the NixOS configuration of Hydra in this repo
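A minimal sketch of item (1), assuming hypothetical store URIs and driving one `nix copy` process per target; note that each target then compresses independently, which is exactly the duplication the first comment above wants to avoid:

```python
# Sketch: fan the same outputs out to several stores concurrently by
# running one `nix copy` per target. Store URIs are assumptions.
import subprocess
from concurrent.futures import ThreadPoolExecutor

TARGETS = [
    "s3://cache-primary?region=us-east-1",    # hypothetical primary
    "s3://cache-secondary?region=eu-west-1",  # hypothetical secondary
]


def copy_to(target: str, paths: list[str]) -> None:
    subprocess.run(["nix", "copy", "--to", target, *paths], check=True)


def dual_write(paths: list[str]) -> None:
    with ThreadPoolExecutor(max_workers=len(TARGETS)) as pool:
        futures = [pool.submit(copy_to, t, paths) for t in TARGETS]
        for future in futures:
            future.result()  # surface any per-target failure


if __name__ == "__main__":
    dual_write(["/nix/store/...-example"])  # placeholder store path
```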
Describe alternatives you've considered
Additional context
Garbage collection policies on the targets are yet to be determined. If a background process GCs the alternative stores but is not aggressive enough, Hydra can end up trying to insert more than a store will accept; conversely, if Hydra itself is responsible for GC on a store it cannot push into, we are mixing GC policies into the CI software...