Compare commits


84 commits

Author SHA1 Message Date
Luc Perkins d35e6e72df
Merge pull request #137 from DeterminateSystems/dependabot/cargo/crossbeam-channel-0.5.15
build(deps): bump crossbeam-channel from 0.5.14 to 0.5.15
2025-04-10 12:13:17 -03:00
dependabot[bot] 6fc832cb76
build(deps): bump crossbeam-channel from 0.5.14 to 0.5.15
Bumps [crossbeam-channel](https://github.com/crossbeam-rs/crossbeam) from 0.5.14 to 0.5.15.
- [Release notes](https://github.com/crossbeam-rs/crossbeam/releases)
- [Changelog](https://github.com/crossbeam-rs/crossbeam/blob/master/CHANGELOG.md)
- [Commits](https://github.com/crossbeam-rs/crossbeam/compare/crossbeam-channel-0.5.14...crossbeam-channel-0.5.15)

---
updated-dependencies:
- dependency-name: crossbeam-channel
  dependency-version: 0.5.15
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-04-10 14:45:15 +00:00
Luc Perkins 78a56de86a
Merge pull request #136 from DeterminateSystems/dependabot/cargo/tokio-1.44.2
build(deps): bump tokio from 1.44.1 to 1.44.2
2025-04-08 16:20:27 -03:00
dependabot[bot] 9fdc760dcb
build(deps): bump tokio from 1.44.1 to 1.44.2
Bumps [tokio](https://github.com/tokio-rs/tokio) from 1.44.1 to 1.44.2.
- [Release notes](https://github.com/tokio-rs/tokio/releases)
- [Commits](https://github.com/tokio-rs/tokio/compare/tokio-1.44.1...tokio-1.44.2)

---
updated-dependencies:
- dependency-name: tokio
  dependency-version: 1.44.2
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-04-08 02:10:21 +00:00
Graham Christensen 90e95ab197
Merge pull request #134 from DeterminateSystems/update-deps
Update dependencies
2025-04-02 11:55:00 -04:00
Graham Christensen 9f88cc4842 Update dependencies 2025-04-02 11:26:06 -04:00
Luc Perkins 942b6b0ffe
Merge pull request #129 from DeterminateSystems/update-flake-lock
Update flake.lock and add update-flake-lock support
2025-03-26 16:37:30 -03:00
Luc Perkins 80600ec316
Merge remote-tracking branch 'origin/main' into update-flake-lock 2025-03-26 10:48:45 -03:00
Luc Perkins 582930b2fc
Merge pull request #128 from DeterminateSystems/flakehub-cache-action
Switch to flakehub-cache-action
2025-03-26 10:48:17 -03:00
Luc Perkins b29c2cafae
Merge remote-tracking branch 'origin/main' into flakehub-cache-action 2025-03-26 10:07:45 -03:00
Luc Perkins 0dd0d2d0a6
Update flake.lock 2025-03-26 10:07:39 -03:00
Luc Perkins 5a689bfeb3
Fix merge conflicts with main 2025-03-25 11:45:05 -03:00
Cole Mickens e9600149c7
Merge pull request #127 from DeterminateSystems/colemickens/no-auto-create-users-via-api
write github actions error when user is unauthenticated
2025-03-25 06:22:54 -07:00
Cole Mickens 9bdbfd97f2 flake.lock: Update
Flake lock file updates:

• Updated input 'crane':
    'github:ipetkov/crane/19de14aaeb869287647d9461cbd389187d8ecdb7?narHash=sha256-x4syUjNUuRblR07nDPeLDP7DpphaBVbUaSoeZkFbGSk%3D' (2025-02-19)
  → 'github:ipetkov/crane/70947c1908108c0c551ddfd73d4f750ff2ea67cd?narHash=sha256-vVOAp9ahvnU%2BfQoKd4SEXB2JG2wbENkpqcwlkIXgUC0%3D' (2025-03-19)
• Updated input 'nix':
    'https://api.flakehub.com/f/pinned/NixOS/nix/2.26.2/0194fbd7-e2ec-7193-93a9-05ae757e79a1/source.tar.gz?narHash=sha256-EOnBPe%2BydQ0/P5ZyWnFekvpyUxMcmh2rnP9yNFi/EqU%3D' (2025-02-12)
  → 'https://api.flakehub.com/f/pinned/NixOS/nix/2.27.1/0195c8c5-1964-7a31-b025-ebf9bfeef991/source.tar.gz?narHash=sha256-rBPulEBpn4IiqkPsetuh7BRzT2iGCzZYnogTAsbrvhU%3D' (2025-03-24)
• Updated input 'nixpkgs':
    'https://api.flakehub.com/f/pinned/NixOS/nixpkgs/0.1.755230%2Brev-73cf49b8ad837ade2de76f87eb53fc85ed5d4680/01951ca9-35fa-70f2-b972-630b0cd93c65/source.tar.gz?narHash=sha256-EO1ygNKZlsAC9avfcwHkKGMsmipUk1Uc0TbrEZpkn64%3D' (2025-02-18)
  → 'https://api.flakehub.com/f/pinned/NixOS/nixpkgs/0.1.770807%2Brev-a84ebe20c6bc2ecbcfb000a50776219f48d134cc/0195b626-8c1d-7fb9-9282-563af3d37ab9/source.tar.gz?narHash=sha256-mNqIplmEohk5jRkqYqG19GA8MbQ/D4gQSK0Mu4LvfRQ%3D' (2025-03-19)
2025-03-25 06:06:08 -07:00
Cole Mickens d347386c3f write github actions error when user is unauthenticated 2025-03-25 06:06:08 -07:00
Luc Perkins a774f04dfb
Switch crane input to FlakeHub 2025-03-24 20:25:46 -03:00
Luc Perkins 7ed9fc9cbb
Update flake.lock and add update-flake-lock support 2025-03-24 20:21:34 -03:00
Luc Perkins 8fa4c519ce
Switch to flakehub-cache-action 2025-03-24 20:14:03 -03:00
Graham Christensen 3a905ca44d
Merge pull request #126 from DeterminateSystems/dependabot/cargo/ring-0.17.13
build(deps): bump ring from 0.17.9 to 0.17.13
2025-03-07 10:59:11 -08:00
dependabot[bot] 339b12a07b
build(deps): bump ring from 0.17.9 to 0.17.13
Bumps [ring](https://github.com/briansmith/ring) from 0.17.9 to 0.17.13.
- [Changelog](https://github.com/briansmith/ring/blob/main/RELEASES.md)
- [Commits](https://github.com/briansmith/ring/commits)

---
updated-dependencies:
- dependency-name: ring
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-07 17:14:54 +00:00
Graham Christensen c5e897a376
Merge pull request #124 from DeterminateSystems/updates
Updates to flakes and crates
2025-02-24 15:04:11 -05:00
Cole Helbling 1eeaf990c0
Merge pull request #125 from emilazy/push-qppkqwrwsowk
Fix build with Darwin and Nix 2.26 after updates
2025-02-24 11:27:00 -08:00
Emily 66c0f4bf8e nix.patch: remove
No longer used and confused me for a second.
2025-02-24 18:42:11 +00:00
Emily f64d3e92bf readme: remove mention of Musl
This confused me a bunch so it’ll probably confuse others!
2025-02-24 18:42:11 +00:00
Emily 5e805d85e8 ci: re‐enable flake checker fail mode
Looks like this should be fine now that Nixpkgs has been updated and
the static build isn’t used any more.
2025-02-24 18:42:11 +00:00
Emily 8780a7d721 flake: use the Nix flake in the development shell too 2025-02-24 18:42:11 +00:00
Emily 90e06e4287 cargo: bump attic for Nix 2.26 2025-02-24 18:42:11 +00:00
Emily f5504c8285 flake: fix Darwin build
None of this stuff is necessary these days. This will produce an
executable linking against a Nix store `libiconv`, but as the build
isn’t static to begin with that should be fine. If a static build
is required in the future, `pkgsStatic` can be used as it is in the
`nix-installer` flake.
2025-02-19 22:25:16 +00:00
Emily a5c0301ee8 flake: use stock Nixpkgs Rust toolchain
This makes it easier to use Nixpkgs’ cross‐compilation machinery
and simplifies the flake.
2025-02-19 22:25:16 +00:00
Emily 92b7440174 flake: update crane 2025-02-19 22:24:35 +00:00
Emily ee9c0b9fa5 flake: drop flake-compat input
Not used by anything.
2025-02-19 22:24:35 +00:00
Emily a0571cc895 flake: remove default from the overlay
This belongs in `packages` instead (where a duplicate is already
present anyway).
2025-02-19 20:07:15 +00:00
Cole Helbling 260ce740fc
Update flake.nix 2025-02-19 10:32:14 -08:00
Graham Christensen 22d923e664 ...flake.nix too 2025-02-19 11:48:48 -05:00
Graham Christensen 214e869fee flake.lock: Update
Flake lock file updates:
2025-02-19 11:48:22 -05:00
Graham Christensen 36897bff90 Cargo update 2025-02-19 11:45:06 -05:00
Graham Christensen f3fe26e7c0 Switch from a nix overlay to the nix input's default package, update flakes 2025-02-19 11:43:40 -05:00
Graham Christensen 61233c3bb5 flake.lock: Update
Flake lock file updates:

• Updated input 'nix':
    'https://api.flakehub.com/f/pinned/NixOS/nix/2.22.1/018f61d9-3f9a-7ccf-9bfc-174e3a17ab38/source.tar.gz?narHash=sha256-5Q1WkpTWH7fkVfYhHDc5r0A%2BVc%2BK5xB1UhzrLzBCrB8%3D' (2024-05-09)
  → 'https://api.flakehub.com/f/pinned/NixOS/nix/2.26.2/0194fbd7-e2ec-7193-93a9-05ae757e79a1/source.tar.gz?narHash=sha256-EOnBPe%2BydQ0/P5ZyWnFekvpyUxMcmh2rnP9yNFi/EqU%3D' (2025-02-12)
• Updated input 'nix/flake-compat':
    'github:edolstra/flake-compat/35bb57c0c8d8b62bbfd284272c928ceb64ddbde9?narHash=sha256-4gtG9iQuiKITOjNQQeQIpoIB6b16fm%2B504Ch3sNKLd8%3D' (2023-01-17)
  → 'github:edolstra/flake-compat/ff81ac966bb2cae68946d5ed5fc4994f96d0ffec?narHash=sha256-NeCCThCEP3eCl2l/%2B27kNNK7QrwZB1IJCrXfrbv5oqU%3D' (2024-12-04)
• Updated input 'nix/flake-parts':
    'github:hercules-ci/flake-parts/9126214d0a59633752a136528f5f3b9aa8565b7d?narHash=sha256-sB4SWl2lX95bExY2gMFG5HIzvva5AVMJd4Igm%2BGpZNw%3D' (2024-04-01)
  → 'github:hercules-ci/flake-parts/205b12d8b7cd4802fbcb8e8ef6a0f1408781a4f9?narHash=sha256-4pDvzqnegAfRkPwO3wmwBhVi/Sye1mzps0zHWYnP88c%3D' (2024-12-04)
• Added input 'nix/git-hooks-nix':
    'github:cachix/git-hooks.nix/aa9f40c906904ebd83da78e7f328cd8aeaeae785?narHash=sha256-NdaCraHPp8iYMWzdXAt5Nv6sA3MUzlCiGiR586TCwo0%3D' (2024-12-15)
• Added input 'nix/git-hooks-nix/flake-compat':
    follows 'nix'
• Added input 'nix/git-hooks-nix/gitignore':
    follows 'nix'
• Added input 'nix/git-hooks-nix/nixpkgs':
    follows 'nix/nixpkgs'
• Added input 'nix/git-hooks-nix/nixpkgs-stable':
    follows 'nix/nixpkgs'
• Removed input 'nix/libgit2'
• Updated input 'nix/nixpkgs':
    'github:NixOS/nixpkgs/b550fe4b4776908ac2a861124307045f8e717c8e?narHash=sha256-7kkJQd4rZ%2BvFrzWu8sTRtta5D1kBG0LSRYAfhtmMlSo%3D' (2024-02-28)
  → 'github:NixOS/nixpkgs/48d12d5e70ee91fe8481378e540433a7303dbf6a?narHash=sha256-1Noao/H%2BN8nFB4Beoy8fgwrcOQLVm9o4zKW1ODaqK9E%3D' (2024-12-16)
• Added input 'nix/nixpkgs-23-11':
    'github:NixOS/nixpkgs/a62e6edd6d5e1fa0329b8653c801147986f8d446?narHash=sha256-oamiKNfr2MS6yH64rUn99mIZjc45nGJlj9eGth/3Xuw%3D' (2024-05-31)
• Removed input 'nix/pre-commit-hooks'
• Removed input 'nix/pre-commit-hooks/flake-compat'
• Removed input 'nix/pre-commit-hooks/flake-utils'
• Removed input 'nix/pre-commit-hooks/gitignore'
• Removed input 'nix/pre-commit-hooks/nixpkgs'
• Removed input 'nix/pre-commit-hooks/nixpkgs-stable'
2025-02-19 11:43:22 -05:00
Graham Christensen 1cd5d316da flake.lock: Update
Flake lock file updates:

• Updated input 'fenix':
    'https://api.flakehub.com/f/pinned/nix-community/fenix/0.1.1955%2Brev-60ab4a085ef6ee40f2ef7921ca4061084dd8cf26/01910d03-2462-7e48-b72e-439d1152bd11/source.tar.gz?narHash=sha256-l7/yMehbrL5d4AI8E2hKtNlT50BlUAau4EKTgPg9KcY%3D' (2024-08-01)
  → 'https://api.flakehub.com/f/pinned/nix-community/fenix/0.1.2156%2Brev-de3ea31eb651b663449361f77d9c1e8835290470/0194c095-0041-7b9c-b19e-cf1c4a2adaad/source.tar.gz?narHash=sha256-TC3xA%2B%2BKgprECm/WPsLUd%2Ba77MObZPElCW6eAsjVW1k%3D' (2025-02-01)
• Updated input 'fenix/rust-analyzer-src':
    'github:rust-lang/rust-analyzer/c8e41d95061543715b30880932ec3dc24c42d7ae?narHash=sha256-1na4m2PNH99syz2g/WQ%2BHr3RfY7k4H8NBnmkr5dFDXw%3D' (2024-07-31)
  → 'github:rust-lang/rust-analyzer/3c2aca1e5e9fbabb4e05fc4baa62e807aadc476a?narHash=sha256-1zhfA5NBqin0Z79Se85juvqQteq7uClJMEb7l2pdDUY%3D' (2025-01-30)
• Updated input 'flake-compat':
    'https://api.flakehub.com/f/pinned/edolstra/flake-compat/1.0.1/018afb31-abd1-7bff-a5e4-cff7e18efb7a/source.tar.gz?narHash=sha256-kvjfFW7WAETZlt09AgDn1MrtKzP7t90Vf7vypd3OL1U%3D' (2023-10-04)
  → 'https://api.flakehub.com/f/pinned/edolstra/flake-compat/1.1.0/01948eb7-9cba-704f-bbf3-3fa956735b52/source.tar.gz?narHash=sha256-NeCCThCEP3eCl2l/%2B27kNNK7QrwZB1IJCrXfrbv5oqU%3D' (2024-12-04)
• Updated input 'nixpkgs':
    'https://api.flakehub.com/f/pinned/NixOS/nixpkgs/0.2311.558675%2Brev-9d29cd266cebf80234c98dd0b87256b6be0af44e/018fb680-a725-7c9d-825e-aadb0901263e/source.tar.gz?narHash=sha256-xim1b5/HZYbWaZKyI7cn9TJCM6ewNVZnesRr00mXeS4%3D' (2024-05-25)
  → 'https://api.flakehub.com/f/pinned/NixOS/nixpkgs/0.1.755230%2Brev-73cf49b8ad837ade2de76f87eb53fc85ed5d4680/01951ca9-35fa-70f2-b972-630b0cd93c65/source.tar.gz?narHash=sha256-EO1ygNKZlsAC9avfcwHkKGMsmipUk1Uc0TbrEZpkn64%3D' (2025-02-18)
2025-02-19 11:37:25 -05:00
Cole Helbling c12f7ed42f
Merge pull request #121 from maxheld83/patch-1
notify users about EOL of magic nix cache
2025-02-18 12:49:59 -08:00
Cole Helbling 7099e801ae fixup: README link 2025-02-18 12:05:25 -08:00
Cole Helbling b1a16d5ac2
Merge pull request #122 from DeterminateSystems/fh-583-shorten-file-changed-check
Check if token changed every 3 seconds instead
2025-02-10 07:43:16 -08:00
Cole Helbling 1fad1473c5 Check if token changed every 3 seconds instead 2025-02-10 07:28:34 -08:00
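The commit above shortens the interval at which the daemon checks whether the token file has changed. The commit message doesn't show the mechanism; the sketch below assumes a simple content poll (`watch_for_change` is a hypothetical name, and only the 3-second interval comes from the commit title):

```rust
use std::fs;
use std::path::Path;
use std::time::Duration;

// Poll the token file on a fixed interval and report whether its
// contents changed within `max_polls` iterations. The real daemon's
// implementation may differ; the action checks every 3 seconds.
fn watch_for_change(path: &Path, interval: Duration, max_polls: u32) -> bool {
    let initial = fs::read(path).ok();
    for _ in 0..max_polls {
        std::thread::sleep(interval);
        if fs::read(path).ok() != initial {
            return true;
        }
    }
    false
}

fn main() {
    let token = std::env::temp_dir().join("mnc-demo-token");
    fs::write(&token, "old").unwrap();
    // Modify the file from another thread while the main thread polls.
    let p = token.clone();
    std::thread::spawn(move || {
        std::thread::sleep(Duration::from_millis(50));
        fs::write(&p, "new").unwrap();
    });
    // The demo uses a short interval so it finishes quickly.
    assert!(watch_for_change(&token, Duration::from_millis(20), 50));
    println!("ok");
}
```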
Max Held 8492a69263
notify users about EOL
closes https://github.com/DeterminateSystems/magic-nix-cache/issues/119
2025-01-29 18:44:28 +01:00
Graham Christensen a65ff39edd
Merge pull request #118 from DeterminateSystems/grahamc-patch-2
drop 429 notice
2025-01-16 18:48:03 -05:00
Graham Christensen ef4bd6fb91
drop 429 notice 2025-01-16 18:34:39 -05:00
Cole Helbling 4d32a8ef4e
Merge pull request #117 from DeterminateSystems/fixup-ci
fixup: bump forgotten download-artifact action version
2025-01-16 09:24:02 -08:00
Cole Helbling e8e38ad4fc fixup: bump forgotten download-artifact action version 2025-01-16 09:10:05 -08:00
Cole Helbling e70bb1e416
Merge pull request #116 from DeterminateSystems/cole/fh-543
Suggest FlakeHub Cache when hit by 429
2025-01-16 09:00:06 -08:00
Graham Christensen 4e2b37be36 Set a metric field for when GHA 429's 2025-01-16 08:46:41 -08:00
Cole Helbling 322a99d45e fixup: update upload-artifact, download-artifact action versions 2025-01-16 08:46:41 -08:00
Cole Helbling 003f106338
Update GHA 429 notice wording
Co-authored-by: Graham Christensen <graham@grahamc.com>
2025-01-16 07:32:20 -08:00
Cole Helbling b9f89bd546 Suggest FlakeHub Cache when hit by 429 2025-01-14 13:56:17 -08:00
Cole Helbling 215dc0d8e9
Merge pull request #114 from DeterminateSystems/cole/fh-485
Fixup auth when using determinate-nixd with long-running builds
2024-12-04 13:24:01 -08:00
Cole Helbling 2bac50c0ca Move "workaround" notes closer to the workaround 2024-12-04 13:11:19 -08:00
Cole Helbling 5e7acea3d1 Fixup auth when using determinate-nixd with long-running builds 2024-12-04 13:11:19 -08:00
Cole Helbling 11b78639f1 Cargo.lock: update attic dependencies 2024-12-04 12:44:32 -08:00
Cole Helbling 296e9dc1af
Merge pull request #111 from DeterminateSystems/grahamc-patch-2
Use magic-nix-cache-action@main, oops
2024-11-08 09:48:20 -08:00
Graham Christensen f27f314206
Use magic-nix-cache-action@main, oops 2024-11-08 12:29:41 -05:00
Graham Christensen 448d84e32f
Merge pull request #110 from DeterminateSystems/graham/fh-433-magic-nix-cache-should-disable-github-actions-cache-if
Graham/fh 433 magic nix cache should disable GitHub actions cache when flakehub cache is enabled
2024-11-06 12:05:23 -05:00
Graham Christensen d1983bbdff Don't try to use the netrc if it doesn't exist 2024-11-06 09:47:44 -05:00
Graham Christensen a68e1c4d54 Test the patched action 2024-11-05 22:27:17 -05:00
Graham Christensen bf844027bc Turn off the GitHub actions cache if the user expresses no preference, and flakehub cache is in use 2024-11-05 21:22:38 -05:00
Graham Christensen 65060bc705 Switch the GHA Cache preference to a trinary, but treat it as a straight bool for the moment 2024-11-05 21:19:18 -05:00
Graham Christensen 3fd6eeb208 Make the FlakeHubArg a generic Trinary so we can use it for GHA Cache too 2024-11-05 21:11:26 -05:00
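The three commits above replace a boolean preference with a three-valued one, then default it based on whether FlakeHub Cache is in use. A minimal sketch of that shape (the `Trinary` name appears in the commit message; the method name and exact defaulting logic are assumptions):

```rust
// A three-valued user preference, as opposed to a plain bool.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Trinary {
    Enabled,
    Disabled,
    NoPreference,
}

impl Trinary {
    // Hypothetical resolution: explicit choices win, and NoPreference
    // turns the GHA cache off when FlakeHub Cache is in use (per the
    // "Turn off the GitHub actions cache..." commit above).
    fn gha_cache_enabled(self, flakehub_cache_in_use: bool) -> bool {
        match self {
            Trinary::Enabled => true,
            Trinary::Disabled => false,
            Trinary::NoPreference => !flakehub_cache_in_use,
        }
    }
}

fn main() {
    assert!(Trinary::Enabled.gha_cache_enabled(true));
    assert!(!Trinary::Disabled.gha_cache_enabled(false));
    assert!(!Trinary::NoPreference.gha_cache_enabled(true));
    assert!(Trinary::NoPreference.gha_cache_enabled(false));
    println!("ok");
}
```

Modeling the preference as an enum rather than a bool lets "user said nothing" be distinguished from "user said no", which is what makes the FlakeHub-aware default possible.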
Graham Christensen cf183317a5
Merge pull request #109 from DeterminateSystems/colemickens/shutdown
shutdown: wait for flakehub_cache first
2024-11-05 19:45:48 -05:00
Graham Christensen 647b207575
Update magic-nix-cache/src/api.rs 2024-11-05 19:33:11 -05:00
Cole Mickens 625d7717b6 flakehub logging review comments 2024-11-05 15:08:00 -08:00
Graham Christensen 925be77ec2
Merge pull request #108 from mightyiam/bump-checkout-action
Bump GitHub action actions/checkout from v3 to v4
2024-11-05 18:02:55 -05:00
Cole Mickens 7841b8bbe2 flakehub cache init failure is an error 2024-11-05 15:00:56 -08:00
Cole Mickens a54a97ff9b shutdown: info! print paths after FHC upload 2024-11-05 15:00:56 -08:00
Cole Mickens 9f7a4abc4d shutdown: info! if we don't have flakehub_state at workflow_finish 2024-11-05 14:24:56 -08:00
Cole Mickens 799a0c42e6 shutdown: wait for flakehub_cache first 2024-11-05 13:55:36 -08:00
Shahar "Dawn" Or 65899a5ad5 Bump GitHub action actions/checkout from v3 to v4
The breaking change seems to be something about the default Node.js
runtime: https://github.com/actions/checkout/blob/main/CHANGELOG.md#v400

Also updated the example in the documentation, for the sake of the
copy-pasting user (me).
2024-11-05 21:48:35 +07:00
Graham Christensen 6a5abbf3bb
Merge pull request #105 from cfsnyder/main
Fix compatibility issues with alternative GHA cache implementation
2024-09-25 15:17:09 -04:00
Graham Christensen 24af143b67
Don't fail if flakehub cache wasn't requested and its requirements weren't present (#107)
* Treat the use_flakehub flag as an enum to avoid boolean blindness

* Make the match statement around flakehub cache considerate of users who did not opt in to it

* Update magic-nix-cache/src/main.rs
2024-09-25 19:16:43 +00:00
Cory Snyder 4f25f7b3e6 Fix compatibility issues with alternative GHA cache implementation
Fixes two compatibility issues with the alternative GHA cache server
implementation:

https://github.com/falcondev-oss/github-actions-cache-server

1. This implementation does not support redundant forward slashes
   in URL paths. The change allows magic-nix-cache to work properly
   regardless of whether ACTIONS_CACHE_URL ends in a forward slash or
   not.
2. The cache IDs returned by this implementation can be too big for
   an i32, so the representation of the CacheID type has been updated
   to an i64.

Signed-off-by: Cory Snyder <csnyder@1111systems.com>
2024-09-20 05:12:26 -04:00
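The two fixes described above can be sketched as follows (`join_url` and the `CacheId` newtype are hypothetical names; the real code in this repository differs):

```rust
// Fix 1: collapse the boundary between base URL and path so a trailing
// slash on ACTIONS_CACHE_URL never produces a redundant "//".
fn join_url(base: &str, path: &str) -> String {
    format!(
        "{}/{}",
        base.trim_end_matches('/'),
        path.trim_start_matches('/')
    )
}

// Fix 2: cache IDs from alternative servers can exceed i32::MAX,
// so represent them as i64.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
struct CacheId(i64);

fn main() {
    // Same result whether or not the base URL ends in a slash.
    assert_eq!(
        join_url("https://cache.example/", "v1/caches"),
        "https://cache.example/v1/caches"
    );
    assert_eq!(
        join_url("https://cache.example", "/v1/caches"),
        "https://cache.example/v1/caches"
    );
    // An ID above 2^31 - 1 would have overflowed the old i32 type.
    let id = CacheId(3_000_000_000);
    assert!(id.0 > i32::MAX as i64);
    println!("ok");
}
```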
Graham Christensen 955ed68d34
Merge pull request #104 from DeterminateSystems/use-14-large
Update our intel macs to 14-large
2024-09-17 17:46:15 -04:00
Graham Christensen 7c6bd9387c
Merge pull request #103 from DeterminateSystems/fh-cache-under-determinate
Drop the assertion around the netrc under dnixd
2024-09-17 17:45:25 -04:00
Graham Christensen 6acb043852 Update our intel macs to 14-large 2024-09-17 17:28:15 -04:00
Graham Christensen 04af54090e Notify the user with an info if we're ignoring their netrc 2024-09-17 17:23:44 -04:00
Graham Christensen bc76dfa4df Clean up the netrc handling when dnixd is around 2024-09-17 17:17:11 -04:00
Cole Mickens 5b126b691b
Merge pull request #102 from DeterminateSystems/colemickens/fallback
nix.conf: move write for 'fallback', always set it
2024-09-16 12:22:57 -05:00
Cole Mickens 2bcd86656f nix.conf: move write for 'fallback', always set it 2024-09-16 10:08:16 -07:00
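For context on the commit above: `fallback` is a standard `nix.conf` setting; when enabled, Nix builds a derivation from source instead of failing outright if no substituter can serve it. A minimal fragment (exact placement in the generated config is this project's detail, not shown here):

```ini
# nix.conf fragment: fall back to building from source when a
# binary substituter cannot provide a store path.
fallback = true
```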
23 changed files with 2444 additions and 1634 deletions


@@ -1,3 +0,0 @@
# For -Zbuild-std
[target.aarch64-unknown-linux-musl]
rustflags = ["-C", "target-feature=+crt-static", "-C", "link-arg=-lgcc"]


@@ -24,16 +24,16 @@ jobs:
runner: namespace-profile-default-arm64
- nix-system: x86_64-darwin
system: X64-macOS
runner: macos-12
runner: macos-14-large
- nix-system: aarch64-darwin
system: ARM64-macOS
runner: macos-latest-xlarge
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Install Nix on ${{ matrix.systems.system }}
uses: DeterminateSystems/nix-installer-action@main
- name: Magic Nix Cache
uses: DeterminateSystems/magic-nix-cache-action@main
- name: Set up FlakeHub Cache
uses: DeterminateSystems/flakehub-cache-action@main
- name: Build and cache dev shell for ${{ matrix.systems.nix-system }}
run: |
@@ -45,7 +45,7 @@ jobs:
nix-store --export $(nix-store -qR ./result) | xz -9 > "${{ env.ARCHIVE_NAME }}"
- name: Upload magic-nix-cache closure for ${{ matrix.systems.system }}
uses: actions/upload-artifact@v3.1.2
uses: actions/upload-artifact@v4.6.0
with:
# Artifact name
name: ${{ env.ARTIFACT_KEY }}


@@ -13,18 +13,17 @@ jobs:
contents: read
id-token: write
steps:
- uses: actions/checkout@v3
- name: Install Nix
uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/magic-nix-cache-action@main
- uses: actions/checkout@v4
- name: Check health of flake.lock
uses: DeterminateSystems/flake-checker-action@main
# TODO: re-enable fail mode when we find a way to bump Nixpkgs to 24.05
# without breaking the static Rust build
#with:
# fail-mode: true
with:
fail-mode: true
- name: Install Nix
uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/flakehub-cache-action@main
- name: Check Rust formatting
run: nix develop --command cargo fmt --check
@@ -53,30 +52,23 @@ jobs:
- system: ARM64-Linux
runner: namespace-profile-default-arm64
- system: X64-macOS
runner: macos-12
runner: macos-14-large
- system: ARM64-macOS
runner: macos-latest-xlarge
extra_nix_installer_args:
- "--determinate"
- ""
permissions:
contents: read
id-token: write
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Download closure for ${{ matrix.systems.system }}
uses: actions/download-artifact@v3
uses: actions/download-artifact@v4.1.8
with:
name: ${{ env.ARTIFACT_KEY }}
path: ${{ env.ARTIFACT_KEY }}
- name: Install Nix on ${{ matrix.systems.system }}
uses: DeterminateSystems/nix-installer-action@main
with:
source-pr: 1163
extra-args: ${{ matrix.extra_nix_installer_args }}
- run: determinate-nixd login github-action
- name: Test magic-nix-cache-action@main on ${{ matrix.systems.runner }}
uses: DeterminateSystems/magic-nix-cache-action@main


@@ -12,7 +12,7 @@ jobs:
id-token: "write"
contents: "read"
steps:
- uses: "actions/checkout@v3"
- uses: "actions/checkout@v4"
- uses: "DeterminateSystems/nix-installer-action@main"
- uses: "DeterminateSystems/flakehub-push@main"
with:


@@ -5,10 +5,10 @@ jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Install Nix
uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/magic-nix-cache-action@main
- uses: DeterminateSystems/flakehub-cache-action@main
- name: Expose GitHub Runtime
uses: crazy-max/ghaction-github-runtime@v2
- name: Dump credentials


@@ -22,7 +22,7 @@ jobs:
id-token: write # In order to request a JWT for AWS auth
steps:
- name: Checkout
uses: actions/checkout@v3
uses: actions/checkout@v4
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v2
with:
@@ -32,28 +32,28 @@ jobs:
- name: Create the artifacts directory
run: rm -rf ./artifacts && mkdir ./artifacts
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4.1.8
with:
name: magic-nix-cache-ARM64-macOS
path: cache-binary-ARM64-macOS
- name: Persist the cache binary
run: cp ./cache-binary-ARM64-macOS/magic-nix-cache.closure.xz ./artifacts/magic-nix-cache-ARM64-macOS
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4.1.8
with:
name: magic-nix-cache-X64-macOS
path: cache-binary-X64-macOS
- name: Persist the cache binary
run: cp ./cache-binary-X64-macOS/magic-nix-cache.closure.xz ./artifacts/magic-nix-cache-X64-macOS
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4.1.8
with:
name: magic-nix-cache-X64-Linux
path: cache-binary-X64-Linux
- name: Persist the cache binary
run: cp ./cache-binary-X64-Linux/magic-nix-cache.closure.xz ./artifacts/magic-nix-cache-X64-Linux
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4.1.8
with:
name: magic-nix-cache-ARM64-Linux
path: cache-binary-ARM64-Linux


@@ -31,33 +31,33 @@ jobs:
contents: read
steps:
- name: Checkout
uses: actions/checkout@v3
uses: actions/checkout@v4
- name: Create the artifacts directory
run: rm -rf ./artifacts && mkdir ./artifacts
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4.1.8
with:
name: magic-nix-cache-ARM64-macOS
path: cache-binary-ARM64-macOS
- name: Persist the cache binary
run: cp ./cache-binary-ARM64-macOS/magic-nix-cache.closure.xz ./artifacts/magic-nix-cache-ARM64-macOS
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4.1.8
with:
name: magic-nix-cache-X64-macOS
path: cache-binary-X64-macOS
- name: Persist the cache binary
run: cp ./cache-binary-X64-macOS/magic-nix-cache.closure.xz ./artifacts/magic-nix-cache-X64-macOS
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4.1.8
with:
name: magic-nix-cache-X64-Linux
path: cache-binary-X64-Linux
- name: Persist the cache binary
run: cp ./cache-binary-X64-Linux/magic-nix-cache.closure.xz ./artifacts/magic-nix-cache-X64-Linux
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4.1.8
with:
name: magic-nix-cache-ARM64-Linux
path: cache-binary-ARM64-Linux


@@ -19,33 +19,33 @@ jobs:
id-token: write # In order to request a JWT for AWS auth
steps:
- name: Checkout
uses: actions/checkout@v3
uses: actions/checkout@v4
- name: Create the artifacts directory
run: rm -rf ./artifacts && mkdir ./artifacts
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4.1.8
with:
name: magic-nix-cache-ARM64-macOS
path: cache-binary-ARM64-macOS
- name: Persist the cache binary
run: cp ./cache-binary-ARM64-macOS/magic-nix-cache.closure.xz ./artifacts/magic-nix-cache-ARM64-macOS
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4.1.8
with:
name: magic-nix-cache-X64-macOS
path: cache-binary-X64-macOS
- name: Persist the cache binary
run: cp ./cache-binary-X64-macOS/magic-nix-cache.closure.xz ./artifacts/magic-nix-cache-X64-macOS
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4.1.8
with:
name: magic-nix-cache-X64-Linux
path: cache-binary-X64-Linux
- name: Persist the cache binary
run: cp ./cache-binary-X64-Linux/magic-nix-cache.closure.xz ./artifacts/magic-nix-cache-X64-Linux
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4.1.8
with:
name: magic-nix-cache-ARM64-Linux
path: cache-binary-ARM64-Linux


@@ -0,0 +1,20 @@
name: update-flake-lock
on:
workflow_dispatch: # enable manual triggering
schedule:
- cron: "0 0 * * 0" # every Sunday at midnight
jobs:
lockfile:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/flakehub-cache-action@main
- uses: DeterminateSystems/update-flake-lock@main
with:
pr-title: Update flake.lock
pr-labels: |
dependencies
automated

Cargo.lock (generated): 3049 lines changed

File diff suppressed because it is too large.


@@ -1,5 +1,12 @@
# Magic Nix Cache
> [!WARNING]
> The [Magic Nix Cache will stop working](https://determinate.systems/posts/magic-nix-cache-free-tier-eol) on **February 1st, 2025** unless you're on [GitHub Enterprise Server](https://github.com/enterprise).
>
> You can upgrade to [FlakeHub Cache](https://flakehub.com/cache) and get **one month free** using the coupon code **`FHC`**.
>
> For more information, read [this blog post](https://determinate.systems/posts/magic-nix-cache-free-tier-eol/).
Save 30-50%+ of CI time without any effort or cost.
Use Magic Nix Cache, a totally free and zero-configuration binary cache for Nix on GitHub Actions.
@@ -10,7 +17,7 @@ permissions:
contents: read
id-token: write
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/magic-nix-cache-action@main
- run: nix flake check
@@ -52,7 +59,7 @@ jobs:
contents: read
id-token: write
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/magic-nix-cache-action@main
- run: nix flake check
@ -84,8 +91,8 @@ For local development, see `gha-cache/README.md` for more details on how to obta
```shell
cargo run -- -c creds.json --upstream https://cache.nixos.org
cargo build --release --target x86_64-unknown-linux-musl
cargo build --release --target aarch64-unknown-linux-musl
cargo build --release --target x86_64-unknown-linux-gnu
cargo build --release --target aarch64-unknown-linux-gnu
nix copy --to 'http://127.0.0.1:3000' $(which bash)
nix-store --store $PWD/test-root --extra-substituters 'http://localhost:3000' --option require-sigs false -r $(which bash)
```


@@ -1,66 +1,27 @@
{
"nodes": {
"crane": {
"inputs": {
"nixpkgs": [
"nixpkgs"
]
},
"locked": {
"lastModified": 1714842444,
"narHash": "sha256-z4HeSYtEdYxKurrbxCMb8v/I1LYDHR/aFrZtGtgUgHw=",
"rev": "c5ee4371eea1728ef04bb09c79577c84d5e67a48",
"revCount": 557,
"lastModified": 1741479724,
"narHash": "sha256-fnyETBKSVRa5abjOiRG/IAzKZq5yX8U6oRrHstPl4VM=",
"rev": "60202a2e3597a3d91f5e791aab03f45470a738b5",
"revCount": 709,
"type": "tarball",
"url": "https://api.flakehub.com/f/pinned/ipetkov/crane/0.16.6/018f4495-627e-7385-b537-81f1c1d4003b/source.tar.gz"
"url": "https://api.flakehub.com/f/pinned/ipetkov/crane/0.20.2/0195784b-915b-7d2d-915d-ab02d1112ef9/source.tar.gz"
},
"original": {
"type": "tarball",
"url": "https://flakehub.com/f/ipetkov/crane/0.16.3.tar.gz"
}
},
"fenix": {
"inputs": {
"nixpkgs": [
"nixpkgs"
],
"rust-analyzer-src": "rust-analyzer-src"
},
"locked": {
"lastModified": 1722493751,
"narHash": "sha256-l7/yMehbrL5d4AI8E2hKtNlT50BlUAau4EKTgPg9KcY=",
"rev": "60ab4a085ef6ee40f2ef7921ca4061084dd8cf26",
"revCount": 1955,
"type": "tarball",
"url": "https://api.flakehub.com/f/pinned/nix-community/fenix/0.1.1955%2Brev-60ab4a085ef6ee40f2ef7921ca4061084dd8cf26/01910d03-2462-7e48-b72e-439d1152bd11/source.tar.gz"
},
"original": {
"type": "tarball",
"url": "https://flakehub.com/f/nix-community/fenix/0.1.1727.tar.gz"
"url": "https://flakehub.com/f/ipetkov/crane/%2A"
}
},
"flake-compat": {
"locked": {
"lastModified": 1696426674,
"narHash": "sha256-kvjfFW7WAETZlt09AgDn1MrtKzP7t90Vf7vypd3OL1U=",
"rev": "0f9255e01c2351cc7d116c072cb317785dd33b33",
"revCount": 57,
"type": "tarball",
"url": "https://api.flakehub.com/f/pinned/edolstra/flake-compat/1.0.1/018afb31-abd1-7bff-a5e4-cff7e18efb7a/source.tar.gz"
},
"original": {
"type": "tarball",
"url": "https://flakehub.com/f/edolstra/flake-compat/1.0.1.tar.gz"
}
},
"flake-compat_2": {
"flake": false,
"locked": {
"lastModified": 1673956053,
"narHash": "sha256-4gtG9iQuiKITOjNQQeQIpoIB6b16fm+504Ch3sNKLd8=",
"lastModified": 1733328505,
"narHash": "sha256-NeCCThCEP3eCl2l/+27kNNK7QrwZB1IJCrXfrbv5oqU=",
"owner": "edolstra",
"repo": "flake-compat",
"rev": "35bb57c0c8d8b62bbfd284272c928ceb64ddbde9",
"rev": "ff81ac966bb2cae68946d5ed5fc4994f96d0ffec",
"type": "github"
},
"original": {
@@ -77,11 +38,11 @@
]
},
"locked": {
"lastModified": 1712014858,
"narHash": "sha256-sB4SWl2lX95bExY2gMFG5HIzvva5AVMJd4Igm+GpZNw=",
"lastModified": 1733312601,
"narHash": "sha256-4pDvzqnegAfRkPwO3wmwBhVi/Sye1mzps0zHWYnP88c=",
"owner": "hercules-ci",
"repo": "flake-parts",
"rev": "9126214d0a59633752a136528f5f3b9aa8565b7d",
"rev": "205b12d8b7cd4802fbcb8e8ef6a0f1408781a4f9",
"type": "github"
},
"original": {
@@ -90,75 +51,91 @@
"type": "github"
}
},
"flake-utils": {
"git-hooks-nix": {
"inputs": {
"flake-compat": [
"nix"
],
"gitignore": [
"nix"
],
"nixpkgs": [
"nix",
"nixpkgs"
],
"nixpkgs-stable": [
"nix",
"nixpkgs"
]
},
"locked": {
"lastModified": 1667395993,
"narHash": "sha256-nuEHfE/LcWyuSWnS8t12N1wc105Qtau+/OdUAjtQ0rA=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "5aed5285a952e0b949eb3ba02c12fa4fcfef535f",
"lastModified": 1734279981,
"narHash": "sha256-NdaCraHPp8iYMWzdXAt5Nv6sA3MUzlCiGiR586TCwo0=",
"owner": "cachix",
"repo": "git-hooks.nix",
"rev": "aa9f40c906904ebd83da78e7f328cd8aeaeae785",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"libgit2": {
"flake": false,
"locked": {
"lastModified": 1697646580,
"narHash": "sha256-oX4Z3S9WtJlwvj0uH9HlYcWv+x1hqp8mhXl7HsLu2f0=",
"owner": "libgit2",
"repo": "libgit2",
"rev": "45fd9ed7ae1a9b74b957ef4f337bc3c8b3df01b5",
"type": "github"
},
"original": {
"owner": "libgit2",
"repo": "libgit2",
"owner": "cachix",
"repo": "git-hooks.nix",
"type": "github"
}
},
"nix": {
"inputs": {
"flake-compat": "flake-compat_2",
"flake-compat": "flake-compat",
"flake-parts": "flake-parts",
"libgit2": "libgit2",
"git-hooks-nix": "git-hooks-nix",
"nixpkgs": "nixpkgs",
"nixpkgs-regression": "nixpkgs-regression",
"pre-commit-hooks": "pre-commit-hooks"
"nixpkgs-23-11": "nixpkgs-23-11",
"nixpkgs-regression": "nixpkgs-regression"
},
"locked": {
"lastModified": 1715246928,
"narHash": "sha256-5Q1WkpTWH7fkVfYhHDc5r0A+Vc+K5xB1UhzrLzBCrB8=",
"rev": "adba2f19a02eaa74336a06a026d3c37af8020559",
"revCount": 17044,
"lastModified": 1742824067,
"narHash": "sha256-rBPulEBpn4IiqkPsetuh7BRzT2iGCzZYnogTAsbrvhU=",
"rev": "9cb662df7442a1e2c4600fb8ecb2ad613ebc5a95",
"revCount": 19496,
"type": "tarball",
"url": "https://api.flakehub.com/f/pinned/NixOS/nix/2.22.1/018f61d9-3f9a-7ccf-9bfc-174e3a17ab38/source.tar.gz"
"url": "https://api.flakehub.com/f/pinned/NixOS/nix/2.27.1/0195c8c5-1964-7a31-b025-ebf9bfeef991/source.tar.gz"
},
"original": {
"type": "tarball",
"url": "https://flakehub.com/f/NixOS/nix/%3D2.22.1.tar.gz"
"url": "https://flakehub.com/f/NixOS/nix/2"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1709083642,
"narHash": "sha256-7kkJQd4rZ+vFrzWu8sTRtta5D1kBG0LSRYAfhtmMlSo=",
"lastModified": 1734359947,
"narHash": "sha256-1Noao/H+N8nFB4Beoy8fgwrcOQLVm9o4zKW1ODaqK9E=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "b550fe4b4776908ac2a861124307045f8e717c8e",
"rev": "48d12d5e70ee91fe8481378e540433a7303dbf6a",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "release-23.11",
"ref": "release-24.11",
"repo": "nixpkgs",
"type": "github"
}
},
"nixpkgs-23-11": {
"locked": {
"lastModified": 1717159533,
"narHash": "sha256-oamiKNfr2MS6yH64rUn99mIZjc45nGJlj9eGth/3Xuw=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "a62e6edd6d5e1fa0329b8653c801147986f8d446",
"type": "github"
},
"original": {
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "a62e6edd6d5e1fa0329b8653c801147986f8d446",
"type": "github"
}
},
"nixpkgs-regression": {
"locked": {
"lastModified": 1643052045,
@@ -177,75 +154,24 @@
},
"nixpkgs_2": {
"locked": {
"lastModified": 1716633019,
"narHash": "sha256-xim1b5/HZYbWaZKyI7cn9TJCM6ewNVZnesRr00mXeS4=",
"rev": "9d29cd266cebf80234c98dd0b87256b6be0af44e",
"revCount": 558675,
"lastModified": 1742422364,
"narHash": "sha256-mNqIplmEohk5jRkqYqG19GA8MbQ/D4gQSK0Mu4LvfRQ=",
"rev": "a84ebe20c6bc2ecbcfb000a50776219f48d134cc",
"revCount": 770807,
"type": "tarball",
"url": "https://api.flakehub.com/f/pinned/NixOS/nixpkgs/0.2311.558675%2Brev-9d29cd266cebf80234c98dd0b87256b6be0af44e/018fb680-a725-7c9d-825e-aadb0901263e/source.tar.gz"
"url": "https://api.flakehub.com/f/pinned/NixOS/nixpkgs/0.1.770807%2Brev-a84ebe20c6bc2ecbcfb000a50776219f48d134cc/0195b626-8c1d-7fb9-9282-563af3d37ab9/source.tar.gz"
},
"original": {
"type": "tarball",
"url": "https://flakehub.com/f/NixOS/nixpkgs/0.2311.tar.gz"
}
},
"pre-commit-hooks": {
"inputs": {
"flake-compat": [
"nix"
],
"flake-utils": "flake-utils",
"gitignore": [
"nix"
],
"nixpkgs": [
"nix",
"nixpkgs"
],
"nixpkgs-stable": [
"nix",
"nixpkgs"
]
},
"locked": {
"lastModified": 1712897695,
"narHash": "sha256-nMirxrGteNAl9sWiOhoN5tIHyjBbVi5e2tgZUgZlK3Y=",
"owner": "cachix",
"repo": "pre-commit-hooks.nix",
"rev": "40e6053ecb65fcbf12863338a6dcefb3f55f1bf8",
"type": "github"
},
"original": {
"owner": "cachix",
"repo": "pre-commit-hooks.nix",
"type": "github"
"url": "https://flakehub.com/f/NixOS/nixpkgs/0.1"
}
},
"root": {
"inputs": {
"crane": "crane",
"fenix": "fenix",
"flake-compat": "flake-compat",
"nix": "nix",
"nixpkgs": "nixpkgs_2"
}
},
"rust-analyzer-src": {
"flake": false,
"locked": {
"lastModified": 1722449213,
"narHash": "sha256-1na4m2PNH99syz2g/WQ+Hr3RfY7k4H8NBnmkr5dFDXw=",
"owner": "rust-lang",
"repo": "rust-analyzer",
"rev": "c8e41d95061543715b30880932ec3dc24c42d7ae",
"type": "github"
},
"original": {
"owner": "rust-lang",
"ref": "nightly",
"repo": "rust-analyzer",
"type": "github"
}
}
},
"root": "root",


@@ -2,24 +2,14 @@
description = "GitHub Actions-powered Nix binary cache";
inputs = {
nixpkgs.url = "https://flakehub.com/f/NixOS/nixpkgs/0.2311.tar.gz";
nixpkgs.url = "https://flakehub.com/f/NixOS/nixpkgs/0.1";
fenix = {
url = "https://flakehub.com/f/nix-community/fenix/0.1.1727.tar.gz";
inputs.nixpkgs.follows = "nixpkgs";
};
crane.url = "https://flakehub.com/f/ipetkov/crane/*";
crane = {
url = "https://flakehub.com/f/ipetkov/crane/0.16.3.tar.gz";
inputs.nixpkgs.follows = "nixpkgs";
};
flake-compat.url = "https://flakehub.com/f/edolstra/flake-compat/1.0.1.tar.gz";
nix.url = "https://flakehub.com/f/NixOS/nix/=2.22.1.tar.gz";
nix.url = "https://flakehub.com/f/NixOS/nix/2";
};
outputs = { self, nixpkgs, fenix, crane, ... }@inputs:
outputs = inputs:
let
supportedSystems = [
"aarch64-linux"
@@ -28,69 +18,45 @@
"x86_64-darwin"
];
forEachSupportedSystem = f: nixpkgs.lib.genAttrs supportedSystems (system: f rec {
pkgs = import nixpkgs {
forEachSupportedSystem = f: inputs.nixpkgs.lib.genAttrs supportedSystems (system: f rec {
pkgs = import inputs.nixpkgs {
inherit system;
overlays = [
inputs.nix.overlays.default
self.overlays.default
inputs.self.overlays.default
];
};
inherit (pkgs) lib;
inherit system;
});
fenixToolchain = system: with fenix.packages.${system};
combine ([
stable.clippy
stable.rustc
stable.cargo
stable.rustfmt
stable.rust-src
stable.rust-analyzer
] ++ nixpkgs.lib.optionals (system == "x86_64-linux") [
targets.x86_64-unknown-linux-musl.stable.rust-std
] ++ nixpkgs.lib.optionals (system == "aarch64-linux") [
targets.aarch64-unknown-linux-musl.stable.rust-std
]);
in
{
overlays.default = final: prev:
let
toolchain = fenixToolchain final.hostPlatform.system;
craneLib = (crane.mkLib final).overrideToolchain toolchain;
craneLib = inputs.crane.mkLib final;
crateName = craneLib.crateNameFromCargoToml {
cargoToml = ./magic-nix-cache/Cargo.toml;
};
commonArgs = {
inherit (crateName) pname version;
src = self;
src = inputs.self;
nativeBuildInputs = with final; [
pkg-config
];
buildInputs = [
final.nix
inputs.nix.packages.${final.stdenv.system}.default
final.boost
] ++ final.lib.optionals final.stdenv.isDarwin [
final.darwin.apple_sdk.frameworks.SystemConfiguration
(final.libiconv.override { enableStatic = true; enableShared = false; })
];
NIX_CFLAGS_LINK = final.lib.optionalString final.stdenv.isDarwin "-lc++abi";
};
cargoArtifacts = craneLib.buildDepsOnly commonArgs;
in
rec {
{
magic-nix-cache = craneLib.buildPackage (commonArgs // {
inherit cargoArtifacts;
});
default = magic-nix-cache;
};
packages = forEachSupportedSystem ({ pkgs, ... }: rec {
@@ -128,16 +94,16 @@
createChain 200 startFile;
});
devShells = forEachSupportedSystem ({ system, pkgs, lib }:
let
toolchain = fenixToolchain system;
in
{
devShells = forEachSupportedSystem ({ system, pkgs }: {
default = pkgs.mkShell {
packages = with pkgs; [
toolchain
rustc
cargo
clippy
rustfmt
rust-analyzer
nix # for linking attic
inputs.nix.packages.${stdenv.system}.default # for linking attic
boost # for linking attic
bashInteractive
pkg-config
@@ -149,13 +115,9 @@
bacon
age
] ++ lib.optionals pkgs.stdenv.isDarwin [
libiconv
darwin.apple_sdk.frameworks.SystemConfiguration
];
NIX_CFLAGS_LINK = lib.optionalString pkgs.stdenv.isDarwin "-lc++abi";
RUST_SRC_PATH = "${toolchain}/lib/rustlib/src/rust/library";
RUST_SRC_PATH = "${pkgs.rustPlatform.rustcSrc}/library";
};
});
};


@@ -16,7 +16,7 @@ serde = { version = "1.0.162", default-features = false, features = ["derive"] }
serde_json = { version = "1.0.96", default-features = false }
sha2 = { version = "0.10.6", default-features = false }
thiserror = "1.0.40"
tokio = { version = "1.28.0", default-features = false, features = ["io-util"] }
tokio = { version = "1.44.2", default-features = false, features = ["io-util"] }
tracing = { version = "0.1.37", default-features = false }
unicode-bom = "2.0.2"


@@ -48,6 +48,8 @@ const MAX_CONCURRENCY: usize = 4;
type Result<T> = std::result::Result<T, Error>;
pub type CircuitBreakerTrippedCallback = Arc<Box<dyn Fn() + Send + Sync>>;
/// An API error.
#[derive(Error, Debug)]
pub enum Error {
@@ -82,7 +84,6 @@ pub enum Error {
TooManyCollisions,
}
#[derive(Debug)]
pub struct Api {
/// Credentials to access the cache.
credentials: Credentials,
@@ -104,6 +105,8 @@ pub struct Api {
circuit_breaker_429_tripped: Arc<AtomicBool>,
circuit_breaker_429_tripped_callback: CircuitBreakerTrippedCallback,
/// Backend request statistics.
#[cfg(debug_assertions)]
stats: RequestStats,
@@ -116,7 +119,7 @@ pub struct FileAllocation(CacheId);
/// The ID of a cache.
#[derive(Debug, Clone, Copy, Serialize, Deserialize)]
#[serde(transparent)]
struct CacheId(pub i32);
struct CacheId(pub i64);
/// An API error.
#[derive(Debug, Clone)]
@@ -242,7 +245,10 @@ impl fmt::Display for ApiErrorInfo {
}
impl Api {
pub fn new(credentials: Credentials) -> Result<Self> {
pub fn new(
credentials: Credentials,
circuit_breaker_429_tripped_callback: CircuitBreakerTrippedCallback,
) -> Result<Self> {
let mut headers = HeaderMap::new();
let auth_header = {
let mut h = HeaderValue::from_str(&format!("Bearer {}", credentials.runtime_token))
@@ -273,6 +279,7 @@ impl Api {
client,
concurrency_limit: Arc::new(Semaphore::new(MAX_CONCURRENCY)),
circuit_breaker_429_tripped: Arc::new(AtomicBool::from(false)),
circuit_breaker_429_tripped_callback,
#[cfg(debug_assertions)]
stats: Default::default(),
})
@@ -366,6 +373,8 @@ impl Api {
let client = self.client.clone();
let concurrency_limit = self.concurrency_limit.clone();
let circuit_breaker_429_tripped = self.circuit_breaker_429_tripped.clone();
let circuit_breaker_429_tripped_callback =
self.circuit_breaker_429_tripped_callback.clone();
let url = self.construct_url(&format!("caches/{}", allocation.0 .0));
tokio::task::spawn(async move {
@@ -402,7 +411,8 @@ impl Api {
drop(permit);
circuit_breaker_429_tripped.check_result(&r);
circuit_breaker_429_tripped
.check_result(&r, &circuit_breaker_429_tripped_callback);
r
})
@@ -465,7 +475,8 @@ impl Api {
.check_json()
.await;
self.circuit_breaker_429_tripped.check_result(&res);
self.circuit_breaker_429_tripped
.check_result(&res, &self.circuit_breaker_429_tripped_callback);
match res {
Ok(entry) => Ok(Some(entry)),
@@ -508,7 +519,8 @@ impl Api {
.check_json()
.await;
self.circuit_breaker_429_tripped.check_result(&res);
self.circuit_breaker_429_tripped
.check_result(&res, &self.circuit_breaker_429_tripped_callback);
res
}
@@ -535,7 +547,8 @@ impl Api {
.check()
.await
{
self.circuit_breaker_429_tripped.check_err(&e);
self.circuit_breaker_429_tripped
.check_err(&e, &self.circuit_breaker_429_tripped_callback);
return Err(e);
}
@@ -543,10 +556,13 @@ impl Api {
}
fn construct_url(&self, resource: &str) -> String {
format!(
"{}/_apis/artifactcache/{}",
self.credentials.cache_url, resource
)
let mut url = self.credentials.cache_url.clone();
if !url.ends_with('/') {
url.push('/');
}
url.push_str("_apis/artifactcache/");
url.push_str(resource);
url
}
}
@@ -607,25 +623,34 @@ async fn handle_error(res: reqwest::Response) -> Error {
}
trait AtomicCircuitBreaker {
fn check_err(&self, e: &Error);
fn check_result<T>(&self, r: &std::result::Result<T, Error>);
fn check_err(&self, e: &Error, callback: &CircuitBreakerTrippedCallback);
fn check_result<T>(
&self,
r: &std::result::Result<T, Error>,
callback: &CircuitBreakerTrippedCallback,
);
}
impl AtomicCircuitBreaker for AtomicBool {
fn check_result<T>(&self, r: &std::result::Result<T, Error>) {
fn check_result<T>(
&self,
r: &std::result::Result<T, Error>,
callback: &CircuitBreakerTrippedCallback,
) {
if let Err(ref e) = r {
self.check_err(e)
self.check_err(e, callback)
}
}
fn check_err(&self, e: &Error) {
fn check_err(&self, e: &Error, callback: &CircuitBreakerTrippedCallback) {
if let Error::ApiError {
status: reqwest::StatusCode::TOO_MANY_REQUESTS,
info: ref _info,
..
} = e
{
tracing::info!("Disabling GitHub Actions Cache due to 429: Too Many Requests");
self.store(true, Ordering::Relaxed);
callback();
}
}
}


@@ -60,6 +60,6 @@ hyper-util = { version = "0.1", features = ["tokio", "server-auto", "http1"] }
xdg = { version = "2.5.2" }
[dependencies.tokio]
version = "1.28.0"
version = "1.44.2"
default-features = false
features = ["fs", "macros", "process", "rt", "rt-multi-thread", "sync"]


@@ -99,17 +99,23 @@ async fn workflow_finish(
gha_cache.shutdown().await?;
}
if let Some(attic_state) = state.flakehub_state.write().await.take() {
tracing::info!("Waiting for FlakeHub cache uploads to finish");
let paths = attic_state.push_session.wait().await?;
let paths = paths.keys().map(|s| s.name()).collect::<Vec<_>>();
tracing::info!(?paths, "FlakeHub Cache uploads completed");
} else {
tracing::info!("FlakeHub cache is not enabled, not uploading anything to it");
}
if let Some(sender) = state.shutdown_sender.lock().await.take() {
sender
.send(())
.map_err(|_| Error::Internal("Sending shutdown server message".to_owned()))?;
}
if let Some(attic_state) = state.flakehub_state.write().await.take() {
tracing::info!("Waiting for FlakeHub cache uploads to finish");
let _paths = attic_state.push_session.wait().await?;
}
// NOTE(cole-h): see `init_logging`
if let Some(logfile) = &state.logfile {
let logfile_contents = std::fs::read_to_string(logfile)


@@ -1,5 +1,6 @@
use crate::env::Environment;
use crate::error::{self, Error, Result};
use crate::error::{Error, Result};
use crate::DETERMINATE_NETRC_PATH;
use anyhow::Context;
use attic::cache::CacheName;
use attic::nix_store::{NixStore, StorePath};
@@ -13,7 +14,8 @@ use attic_client::{
use reqwest::header::HeaderValue;
use reqwest::Url;
use serde::Deserialize;
use std::path::Path;
use std::os::unix::fs::MetadataExt;
use std::path::{Path, PathBuf};
use std::sync::Arc;
use tokio::fs::File;
use tokio::io::{AsyncReadExt, AsyncWriteExt};
@@ -32,74 +34,21 @@ pub struct State {
pub async fn init_cache(
environment: Environment,
flakehub_api_server: &Url,
flakehub_api_server_netrc: &Path,
flakehub_cache_server: &Url,
flakehub_flake_name: Option<String>,
flakehub_flake_name: &Option<String>,
store: Arc<NixStore>,
using_dnixd: bool,
auth_method: &super::FlakeHubAuthSource,
) -> Result<State> {
if using_dnixd {
let dnixd_state_dir: &Path = Path::new(&crate::DETERMINATE_STATE_DIR);
let expected_netrc_path = dnixd_state_dir.join("netrc");
if flakehub_api_server_netrc != expected_netrc_path {
let err = format!("flakehub-api-server-netrc was ({}), expected ({}) since determinate-nixd is available", flakehub_api_server_netrc.display(), expected_netrc_path.display());
return Err(error::Error::Config(err));
}
}
// Parse netrc to get the credentials for api.flakehub.com.
let netrc = {
let mut netrc_file = File::open(flakehub_api_server_netrc).await.map_err(|e| {
Error::Internal(format!(
"Failed to open {}: {}",
flakehub_api_server_netrc.display(),
e
))
})?;
let mut netrc_contents = String::new();
netrc_file
.read_to_string(&mut netrc_contents)
.await
.map_err(|e| {
Error::Internal(format!(
"Failed to read {} contents: {}",
flakehub_api_server_netrc.display(),
e
))
})?;
netrc_rs::Netrc::parse(netrc_contents, false).map_err(Error::Netrc)?
};
let netrc_path = auth_method.as_path_buf();
let NetrcInfo {
netrc,
flakehub_cache_server_hostname,
flakehub_login,
flakehub_password,
} = extract_info_from_netrc(&netrc_path, flakehub_api_server, flakehub_cache_server).await?;
let flakehub_netrc_entry = {
netrc
.machines
.iter()
.find(|machine| {
machine.name.as_ref() == flakehub_api_server.host().map(|x| x.to_string()).as_ref()
})
.ok_or_else(|| Error::MissingCreds(flakehub_api_server.to_string()))?
.to_owned()
};
let flakehub_cache_server_hostname = flakehub_cache_server
.host()
.ok_or_else(|| Error::BadUrl(flakehub_cache_server.to_owned()))?
.to_string();
let flakehub_login = flakehub_netrc_entry.login.as_ref().ok_or_else(|| {
Error::Config(format!(
"netrc file does not contain a login for '{}'",
flakehub_api_server
))
})?;
let flakehub_password = flakehub_netrc_entry.password.ok_or_else(|| {
Error::Config(format!(
"netrc file does not contain a password for '{}'",
flakehub_api_server
))
})?;
if !using_dnixd {
if let super::FlakeHubAuthSource::Netrc(netrc_path) = auth_method {
// Append an entry for the FlakeHub cache server to netrc.
if !netrc
.machines
@@ -109,12 +58,12 @@ pub async fn init_cache(
let mut netrc_file = tokio::fs::OpenOptions::new()
.create(false)
.append(true)
.open(flakehub_api_server_netrc)
.open(netrc_path)
.await
.map_err(|e| {
Error::Internal(format!(
"Failed to open {} for appending: {}",
flakehub_api_server_netrc.display(),
netrc_path.display(),
e
))
})?;
@@ -131,7 +80,7 @@
.map_err(|e| {
Error::Internal(format!(
"Failed to write credentials to {}: {}",
flakehub_api_server_netrc.display(),
netrc_path.display(),
e
))
})?;
@@ -149,23 +98,40 @@
// Periodically refresh JWT in GitHub Actions environment
if environment.is_github_actions() {
// NOTE(cole-h): This is a workaround -- at the time of writing, GitHub Actions JWTs are only
// valid for 5 minutes after being issued. FlakeHub uses these JWTs for authentication, which
// means that after those 5 minutes have passed and the token is expired, FlakeHub (and by
// extension FlakeHub Cache) will no longer allow requests using this token. However, GitHub
// gives us a way to repeatedly request new tokens, so we utilize that and refresh the token
// every 2 minutes (less than half of the lifetime of the token).
let netrc_path_clone = flakehub_api_server_netrc.to_path_buf();
let initial_github_jwt_clone = flakehub_password.clone();
let flakehub_cache_server_clone = flakehub_cache_server.to_string();
let api_clone = api.clone();
match auth_method {
super::FlakeHubAuthSource::Netrc(path) => {
let netrc_path_clone = path.to_path_buf();
let initial_github_jwt_clone = flakehub_password.clone();
let flakehub_cache_server_clone = flakehub_cache_server.to_string();
let api_clone = api.clone();
tokio::task::spawn(refresh_github_actions_jwt_worker(
netrc_path_clone,
initial_github_jwt_clone,
flakehub_cache_server_clone,
api_clone,
));
tokio::task::spawn(refresh_github_actions_jwt_worker(
netrc_path_clone,
initial_github_jwt_clone,
flakehub_cache_server_clone,
api_clone,
));
}
crate::FlakeHubAuthSource::DeterminateNixd => {
let api_clone = api.clone();
let netrc_file = PathBuf::from(DETERMINATE_NETRC_PATH);
let flakehub_api_server_clone = flakehub_api_server.clone();
let flakehub_cache_server_clone = flakehub_cache_server.clone();
let initial_meta = tokio::fs::metadata(&netrc_file).await.map_err(|e| {
Error::Io(e, format!("getting metadata of {}", netrc_file.display()))
})?;
let initial_inode = initial_meta.ino();
tokio::task::spawn(refresh_determinate_token_worker(
netrc_file,
initial_inode,
flakehub_api_server_clone,
flakehub_cache_server_clone,
api_clone,
));
}
}
}
// Get the cache UUID for this project.
@@ -244,6 +210,72 @@
Ok(state)
}
#[derive(Debug)]
struct NetrcInfo {
netrc: netrc_rs::Netrc,
flakehub_cache_server_hostname: String,
flakehub_login: String,
flakehub_password: String,
}
#[tracing::instrument]
async fn extract_info_from_netrc(
netrc_path: &Path,
flakehub_api_server: &Url,
flakehub_cache_server: &Url,
) -> Result<NetrcInfo> {
let netrc = {
let mut netrc_file = File::open(netrc_path).await.map_err(|e| {
Error::Internal(format!("Failed to open {}: {}", netrc_path.display(), e))
})?;
let mut netrc_contents = String::new();
netrc_file
.read_to_string(&mut netrc_contents)
.await
.map_err(|e| {
Error::Internal(format!(
"Failed to read {} contents: {}",
netrc_path.display(),
e
))
})?;
netrc_rs::Netrc::parse(netrc_contents, false).map_err(Error::Netrc)?
};
let flakehub_netrc_entry = netrc
.machines
.iter()
.find(|machine| {
machine.name.as_ref() == flakehub_api_server.host().map(|x| x.to_string()).as_ref()
})
.ok_or_else(|| Error::MissingCreds(flakehub_api_server.to_string()))?
.to_owned();
let flakehub_cache_server_hostname = flakehub_cache_server
.host()
.ok_or_else(|| Error::BadUrl(flakehub_cache_server.to_owned()))?
.to_string();
let flakehub_login = flakehub_netrc_entry.login.ok_or_else(|| {
Error::Config(format!(
"netrc file does not contain a login for '{}'",
flakehub_api_server
))
})?;
let flakehub_password = flakehub_netrc_entry.password.ok_or_else(|| {
Error::Config(format!(
"netrc file does not contain a password for '{}'",
flakehub_api_server
))
})?;
Ok(NetrcInfo {
netrc,
flakehub_cache_server_hostname,
flakehub_login,
flakehub_password,
})
}
pub async fn enqueue_paths(state: &State, store_paths: Vec<StorePath>) -> Result<()> {
state.push_session.queue_many(store_paths)?;
@@ -259,6 +291,13 @@ async fn refresh_github_actions_jwt_worker(
flakehub_cache_server_clone: String,
api: Arc<RwLock<ApiClient>>,
) -> Result<()> {
// NOTE(cole-h): This is a workaround -- at the time of writing, GitHub Actions JWTs are only
// valid for 5 minutes after being issued. FlakeHub uses these JWTs for authentication, which
// means that after those 5 minutes have passed and the token is expired, FlakeHub (and by
// extension FlakeHub Cache) will no longer allow requests using this token. However, GitHub
// gives us a way to repeatedly request new tokens, so we utilize that and refresh the token
// every 2 minutes (less than half of the lifetime of the token).
// TODO(cole-h): this should probably be half of the token's lifetime ((exp - iat) / 2), but
// getting this is nontrivial so I'm not going to do it until GitHub changes the lifetime and
// breaks this.
@@ -377,3 +416,77 @@
Ok(new_github_jwt_string)
}
#[tracing::instrument(skip_all)]
async fn refresh_determinate_token_worker(
netrc_file: PathBuf,
mut inode: u64,
flakehub_api_server: Url,
flakehub_cache_server: Url,
api_clone: Arc<RwLock<ApiClient>>,
) {
// NOTE(cole-h): This is a workaround -- at the time of writing, determinate-nixd handles the
// GitHub Actions JWT refreshing for us, which means we don't know when this will happen. At the
// moment, it does it roughly every 2 minutes (less than half of the total lifetime of the
// issued token).
loop {
tokio::time::sleep(std::time::Duration::from_secs(3)).await;
let meta = tokio::fs::metadata(&netrc_file)
.await
.map_err(|e| Error::Io(e, format!("getting metadata of {}", netrc_file.display())));
let Ok(meta) = meta else {
tracing::error!(e = ?meta);
continue;
};
let current_inode = meta.ino();
if current_inode == inode {
tracing::debug!("current inode is the same, file didn't change");
continue;
}
tracing::debug!("current inode is different, file changed");
inode = current_inode;
let flakehub_password = match extract_info_from_netrc(
&netrc_file,
&flakehub_api_server,
&flakehub_cache_server,
)
.await
{
Ok(NetrcInfo {
flakehub_password, ..
}) => flakehub_password,
Err(e) => {
tracing::error!(?e, "Failed to extract auth info from netrc");
continue;
}
};
let server_config = ServerConfig {
endpoint: flakehub_cache_server.to_string(),
token: Some(attic_client::config::ServerTokenConfig::Raw {
token: flakehub_password,
}),
};
let new_api = ApiClient::from_server_config(server_config.clone());
let Ok(new_api) = new_api else {
tracing::error!(e = ?new_api, "Failed to construct new ApiClient");
continue;
};
{
let mut api_client = api_clone.write().await;
*api_client = new_api;
}
tracing::debug!("Stored new token in API client, sleeping for 30s");
}
}


@@ -37,7 +37,15 @@ impl GhaCache {
metrics: Arc<telemetry::TelemetryReport>,
narinfo_negative_cache: Arc<RwLock<HashSet<String>>>,
) -> Result<GhaCache> {
let mut api = Api::new(credentials)?;
let cb_metrics = metrics.clone();
let mut api = Api::new(
credentials,
Arc::new(Box::new(move || {
cb_metrics
.tripped_429
.store(true, std::sync::atomic::Ordering::Relaxed);
})),
)?;
if let Some(cache_version) = &cache_version {
api.mutate_version(cache_version.as_bytes());


@@ -23,7 +23,7 @@ mod telemetry;
mod util;
use std::collections::HashSet;
use std::fs::{self, create_dir_all};
use std::fs::create_dir_all;
use std::io::Write;
use std::net::SocketAddr;
use std::path::{Path, PathBuf};
@@ -45,6 +45,7 @@ use gha_cache::Credentials;
const DETERMINATE_STATE_DIR: &str = "/nix/var/determinate";
const DETERMINATE_NIXD_SOCKET_NAME: &str = "determinate-nixd.socket";
const DETERMINATE_NETRC_PATH: &str = "/nix/var/determinate/netrc";
// TODO(colemickens): refactor, move with other UDS stuff (or all PBH stuff) to new file
#[derive(Clone, Debug, Serialize, Deserialize)]
@@ -59,13 +60,6 @@ type State = Arc<StateInner>;
/// GitHub Actions-powered Nix binary cache
#[derive(Parser, Debug)]
struct Args {
/// JSON file containing credentials.
///
/// If this is not specified, credentials will be loaded
/// from the environment.
#[arg(short = 'c', long)]
credentials_file: Option<PathBuf>,
/// Address to listen on.
///
/// FIXME: IPv6
@@ -117,11 +111,11 @@ struct Args {
/// Whether to use the GHA cache.
#[arg(long)]
use_gha_cache: bool,
use_gha_cache: Option<Option<CacheTrinary>>,
/// Whether to use the FlakeHub binary cache.
#[arg(long)]
use_flakehub: bool,
use_flakehub: Option<Option<CacheTrinary>>,
/// URL to which to post startup notification.
#[arg(long)]
@@ -136,15 +130,48 @@
diff_store: bool,
}
#[derive(Debug, Clone, Copy, PartialEq, clap::ValueEnum)]
pub enum CacheTrinary {
NoPreference,
Enabled,
Disabled,
}
impl From<Option<Option<CacheTrinary>>> for CacheTrinary {
fn from(b: Option<Option<CacheTrinary>>) -> Self {
match b {
None => CacheTrinary::NoPreference,
Some(None) => CacheTrinary::Enabled,
Some(Some(v)) => v,
}
}
}
#[derive(PartialEq, Clone, Copy)]
pub enum Dnixd {
Available,
Missing,
}
impl From<bool> for Dnixd {
fn from(b: bool) -> Self {
if b {
Dnixd::Available
} else {
Dnixd::Missing
}
}
}
impl Args {
fn validate(&self, environment: env::Environment) -> Result<(), error::Error> {
if environment.is_gitlab_ci() && self.use_gha_cache {
if environment.is_gitlab_ci() && self.github_cache_preference() == CacheTrinary::Enabled {
return Err(error::Error::Config(String::from(
"the --use-gha-cache flag should not be applied in GitLab CI",
)));
}
if environment.is_gitlab_ci() && !self.use_flakehub {
if environment.is_gitlab_ci() && self.flakehub_preference() != CacheTrinary::Enabled {
return Err(error::Error::Config(String::from(
"you must set --use-flakehub in GitLab CI",
)));
@@ -152,6 +179,14 @@ impl Args {
Ok(())
}
fn github_cache_preference(&self) -> CacheTrinary {
self.use_gha_cache.into()
}
fn flakehub_preference(&self) -> CacheTrinary {
self.use_flakehub.into()
}
}
fn default_nix_conf() -> PathBuf {
@@ -193,6 +228,26 @@ struct StateInner {
original_paths: Option<Mutex<HashSet<PathBuf>>>,
}
#[derive(Debug, Clone)]
pub(crate) enum FlakeHubAuthSource {
DeterminateNixd,
Netrc(PathBuf),
}
impl FlakeHubAuthSource {
pub(crate) fn as_path_buf(&self) -> PathBuf {
match &self {
Self::Netrc(path) => path.clone(),
Self::DeterminateNixd => {
let mut path = PathBuf::from(DETERMINATE_STATE_DIR);
path.push("netrc");
path
}
}
}
}
async fn main_cli() -> Result<()> {
let guard = init_logging()?;
let _tracing_guard = guard.appender_guard;
@@ -206,9 +261,9 @@ async fn main_cli() -> Result<()> {
let dnixd_uds_socket_dir: &Path = Path::new(&DETERMINATE_STATE_DIR);
let dnixd_uds_socket_path = dnixd_uds_socket_dir.join(DETERMINATE_NIXD_SOCKET_NAME);
let dnixd_available = dnixd_uds_socket_path.exists();
let dnixd_available: Dnixd = dnixd_uds_socket_path.exists().into();
let nix_conf_path: PathBuf = args.nix_conf;
let nix_conf_path: PathBuf = args.nix_conf.clone();
// NOTE: we expect this to point to a user nix.conf
// we always open/append to it to be able to append the extra-substituter for github-actions cache
@@ -222,47 +277,85 @@
.open(&nix_conf_path)
.with_context(|| "Creating nix.conf")?;
// always enable fallback, first
nix_conf
.write_all(b"fallback = true\n")
.with_context(|| "Setting fallback in nix.conf")?;
let store = Arc::new(NixStore::connect()?);
let narinfo_negative_cache = Arc::new(RwLock::new(HashSet::new()));
let flakehub_state = if args.use_flakehub {
let flakehub_cache_server = args.flakehub_cache_server;
let flakehub_auth_method: Option<FlakeHubAuthSource> = match (
args.flakehub_preference(),
&args.flakehub_api_server_netrc,
dnixd_available,
) {
// User has explicitly passed --use-flakehub=disabled, so just straight up don't
(CacheTrinary::Disabled, _, _) => {
tracing::info!("Disabling FlakeHub cache.");
None
}
let flakehub_api_server_netrc = if dnixd_available {
let dnixd_netrc_path = PathBuf::from(DETERMINATE_STATE_DIR).join("netrc");
args.flakehub_api_server_netrc.unwrap_or(dnixd_netrc_path)
} else {
args.flakehub_api_server_netrc.ok_or_else(|| {
anyhow!(
"--flakehub-api-server-netrc is required when determinate-nixd is unavailable"
)
})?
};
// User has no preference, did not pass a netrc, and determinate-nixd is not available
(CacheTrinary::NoPreference, None, Dnixd::Missing) => None,
// Use it when determinate-nixd is available, and let the user know what's going on
(pref, user_netrc_path, Dnixd::Available) => {
if pref == CacheTrinary::NoPreference {
tracing::info!("Enabling FlakeHub cache because determinate-nixd is available.");
}
if user_netrc_path.is_some() {
tracing::info!("Ignoring the user-specified --flakehub-api-server-netrc, in favor of the determinate-nixd netrc");
}
Some(FlakeHubAuthSource::DeterminateNixd)
}
// When determinate-nixd is not available, but the user specified a netrc
(_, Some(path), Dnixd::Missing) => {
if path.exists() {
Some(FlakeHubAuthSource::Netrc(path.to_owned()))
} else {
tracing::debug!(path = %path.display(), "User-provided netrc does not exist");
None
}
}
// User explicitly turned on flakehub cache, but we have no netrc and determinate-nixd is not present
(CacheTrinary::Enabled, None, Dnixd::Missing) => {
return Err(anyhow!(
"--flakehub-api-server-netrc is required when determinate-nixd is unavailable"
));
}
};
let flakehub_state = if let Some(auth_method) = flakehub_auth_method {
let flakehub_cache_server = &args.flakehub_cache_server;
let flakehub_api_server = &args.flakehub_api_server;
let flakehub_flake_name = args.flakehub_flake_name;
let flakehub_flake_name = &args.flakehub_flake_name;
match flakehub::init_cache(
environment,
flakehub_api_server,
&flakehub_api_server_netrc,
&flakehub_cache_server,
flakehub_cache_server,
flakehub_flake_name,
store.clone(),
dnixd_available,
&auth_method,
)
.await
{
Ok(state) => {
if !dnixd_available {
if let FlakeHubAuthSource::Netrc(ref path) = auth_method {
nix_conf
.write_all(
format!(
"extra-substituters = {}?trusted=1\nnetrc-file = {}\n",
&flakehub_cache_server,
flakehub_api_server_netrc.display()
path.display()
)
.as_bytes(),
)
@@ -273,38 +366,24 @@ async fn main_cli() -> Result<()> {
Some(state)
}
Err(err) => {
tracing::debug!("FlakeHub cache initialization failed: {}", err);
tracing::error!("FlakeHub cache initialization failed: {}. Unable to authenticate to FlakeHub. Individuals must register at FlakeHub.com; Organizations must create an organization at FlakeHub.com.", err);
println!("::error title={{FlakeHub: Unauthenticated}}::{{Unable to authenticate to FlakeHub. Individuals must register at FlakeHub.com; Organizations must create an organization at FlakeHub.com.}}");
None
}
}
} else {
tracing::info!(
"FlakeHub cache is disabled, as the `use-flakehub` setting is set to `false`."
);
tracing::info!("FlakeHub cache is disabled.");
None
};
let gha_cache = if args.use_gha_cache {
let credentials = if let Some(credentials_file) = &args.credentials_file {
tracing::info!("Loading credentials from {:?}", credentials_file);
let bytes = fs::read(credentials_file).with_context(|| {
format!(
"Failed to read credentials file '{}'",
credentials_file.display()
)
})?;
let gha_cache = if (args.github_cache_preference() == CacheTrinary::Enabled)
|| (args.github_cache_preference() == CacheTrinary::NoPreference
&& flakehub_state.is_none())
{
tracing::info!("Loading credentials from environment");
serde_json::from_slice(&bytes).with_context(|| {
format!(
"Failed to deserialize credentials file '{}'",
credentials_file.display()
)
})?
} else {
tracing::info!("Loading credentials from environment");
Credentials::load_from_env()
.with_context(|| "Failed to load credentials from environment (see README.md)")?
};
let credentials = Credentials::load_from_env()
.with_context(|| "Failed to load credentials from environment (see README.md)")?;
let gha_cache = gha::GhaCache::new(
credentials,
@@ -352,7 +431,7 @@ async fn main_cli() -> Result<()> {
original_paths,
});
if dnixd_available {
if dnixd_available == Dnixd::Available {
tracing::info!("Subscribing to Determinate Nixd build events.");
crate::pbh::subscribe_uds_post_build_hook(dnixd_uds_socket_path, state.clone()).await?;
} else {


@@ -186,13 +186,7 @@ pub async fn setup_legacy_post_build_hook(
/* Update nix.conf. */
nix_conf
-    .write_all(
-        format!(
-            "fallback = true\npost-build-hook = {}\n",
-            post_build_hook_script.display()
-        )
-        .as_bytes(),
-    )
+    .write_all(format!("post-build-hook = {}\n", post_build_hook_script.display()).as_bytes())
.with_context(|| "Writing to nix.conf")?;
Ok(())


@@ -28,6 +28,8 @@ pub struct TelemetryReport {
pub num_original_paths: Metric,
pub num_final_paths: Metric,
pub num_new_paths: Metric,
+pub tripped_429: std::sync::atomic::AtomicBool,
}
#[derive(Debug, Default, serde::Serialize)]


@@ -1,48 +0,0 @@
diff --git a/mk/libraries.mk b/mk/libraries.mk
index 6541775f329..5118b957608 100644
--- a/mk/libraries.mk
+++ b/mk/libraries.mk
@@ -130,7 +130,15 @@ define build-library
$(1)_LDFLAGS_USE += $$($(1)_PATH) $$($(1)_LDFLAGS)
- $(1)_INSTALL_PATH := $$(libdir)/$$($(1)_NAME).a
+ $(1)_INSTALL_PATH := $(DESTDIR)$$($(1)_INSTALL_DIR)/$$($(1)_NAME).a
+
+ $$(eval $$(call create-dir, $$($(1)_INSTALL_DIR)))
+
+ $$($(1)_INSTALL_PATH): $$($(1)_OBJS) | $(DESTDIR)$$($(1)_INSTALL_DIR)/
+ +$$(trace-ld) $(LD) -Ur -o $$(_d)/$$($(1)_NAME).o $$^
+ $$(trace-ar) $(AR) crs $$@ $$(_d)/$$($(1)_NAME).o
+
+ install: $$($(1)_INSTALL_PATH)
endif
diff --git a/src/libstore/local.mk b/src/libstore/local.mk
index 8f28bec6c1d..0d41e3c2cac 100644
--- a/src/libstore/local.mk
+++ b/src/libstore/local.mk
@@ -69,6 +69,13 @@ $(d)/build.cc:
clean-files += $(d)/schema.sql.gen.hh $(d)/ca-specific-schema.sql.gen.hh
+$(d)/nix-store.pc: $(d)/nix-store.pc.in
+ $(trace-gen) rm -f $@ && ./config.status --quiet --file=$@
+ifeq ($(BUILD_SHARED_LIBS), 1)
+ sed -i 's|@LIBS_PRIVATE@||' $@
+else
+ sed -i 's|@LIBS_PRIVATE@|Libs.private: $(libstore_LDFLAGS) $(libstore_LDFLAGS_PROPAGATED) $(foreach lib, $(libstore_LIBS), $($(lib)_LDFLAGS))|' $@
+endif
$(eval $(call install-file-in, $(d)/nix-store.pc, $(libdir)/pkgconfig, 0644))
$(foreach i, $(wildcard src/libstore/builtins/*.hh), \
diff --git a/src/libstore/nix-store.pc.in b/src/libstore/nix-store.pc.in
index 6d67b1e0380..738991d307b 100644
--- a/src/libstore/nix-store.pc.in
+++ b/src/libstore/nix-store.pc.in
@@ -7,3 +7,4 @@ Description: Nix Package Manager
Version: @PACKAGE_VERSION@
Libs: -L${libdir} -lnixstore -lnixutil
Cflags: -I${includedir}/nix -std=c++2a
+@LIBS_PRIVATE@