Compare commits

688 Commits

Author SHA1 Message Date
Salman Chishti 6b63a2bfc3
Merge pull request #2176 from actions/prepare-exec-2.0.0-release
Prepare @actions/exec 2.0.0 release
2025-11-18 14:13:24 +00:00
Salman Muin Kayser Chishti 290017ff81 update package json 2025-11-04 13:53:28 +00:00
Salman Chishti 2a876cd69d
Update packages/exec/RELEASES.md
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-04 13:50:24 +00:00
Salman Muin Kayser Chishti f79b906406 Prepare @actions/exec 2.0.0 release 2025-10-31 15:55:29 +00:00
Salman Chishti dcae869a03
Merge pull request #2167 from actions/prepare-http-client-3.0.0-release
Prepare @actions/http-client 3.0.0 release
2025-10-31 15:27:38 +00:00
Salman Chishti 23769d04c7
Merge pull request #2166 from actions/prepare-io-2.0.0-release
Prepare @actions/io 2.0.0 release
2025-10-31 15:27:26 +00:00
Daniel Kennedy d3ab50471b
Merge pull request #2168 from actions/danwkennedy/prepare-4.0.0
Artifact: prepare `v4.0.0`
2025-10-24 13:38:36 -04:00
Daniel Kennedy 1388fd1cac Artifact: prepare `4.0.0` 2025-10-24 13:28:26 -04:00
Bassem Dghaidi 5b446d2657
Merge pull request #2165 from austenstone/max-list-artifact-2k
fix: artifact pagination bugs and configurable artifact count limits
2025-10-24 17:23:44 +02:00
Austen Stone 006d6978c1 linting 2025-10-22 11:44:21 -04:00
Austen Stone 02afeb1577 style: wrap ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT env var assignment for readability 2025-10-22 11:42:01 -04:00
Salman Chishti d47594b536
Update packages/http-client/RELEASES.md
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-22 15:44:09 +01:00
Salman Muin Kayser Chishti 2823824b94 Prepare @actions/http-client 3.0.0 release 2025-10-22 15:40:17 +01:00
Austen Stone cbc06d6766 fix: ensure max artifact count variable is treated as a string 2025-10-22 07:47:51 -04:00
Austen Stone 9bb6708527
fix: remove redundant check for max artifact count variable 2025-10-22 07:41:31 -04:00
Austen Stone be1151df02
Apply suggestion from @Link-
Co-authored-by: Bassem Dghaidi <568794+Link-@users.noreply.github.com>
2025-10-22 07:39:45 -04:00
Salman Muin Kayser Chishti 130842f4e8 Prepare @actions/io 2.0.0 release 2025-10-21 15:55:10 +01:00
Salman Chishti ab82301c62
Merge pull request #2164 from actions/prepare-attest-2.0.0-release
Prepare @actions/attest 2.0.0 release
2025-10-21 15:08:28 +01:00
Austen Stone fea4f6b5c5 fix: resolve critical pagination bugs and add comprehensive testing
- Fix off-by-one error in pagination loop (< to <=) that prevented fetching last page
- Add Math.ceil() to maxNumberOfPages calculation for proper limit handling
- Replace hardcoded 2000 limit with configurable getMaxArtifactListCount()
- Add pagination test for multi-page artifact listing
- Add environment variable test for ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT
- Add comprehensive test coverage for getMaxArtifactListCount() function

Fixes compound bug where pagination and limit logic capped results at 900 artifacts instead of intended 1000.
2025-10-21 09:22:11 -04:00
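A minimal TypeScript sketch of the pagination pattern this commit message describes; the helper names (`fetchPage`, `listAllArtifacts`) and the page size are illustrative assumptions, not the toolkit's actual internals:

```typescript
const DEFAULT_MAX_ARTIFACT_COUNT = 1000
const ARTIFACTS_PER_PAGE = 100 // assumed page size, for illustration only

// Read the configurable cap from ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT,
// falling back to a default when the variable is unset or invalid.
function getMaxArtifactListCount(): number {
  const raw = process.env['ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT']
  const parsed = raw ? parseInt(raw, 10) : NaN
  return Number.isNaN(parsed) || parsed <= 0
    ? DEFAULT_MAX_ARTIFACT_COUNT
    : parsed
}

async function listAllArtifacts(
  fetchPage: (page: number) => Promise<{artifacts: unknown[]; hasMore: boolean}>
): Promise<unknown[]> {
  const maxCount = getMaxArtifactListCount()
  // Math.ceil so a cap that is not a multiple of the page size still yields
  // enough pages (a cap of 950 must round up to 10 pages, not truncate to 9).
  const maxNumberOfPages = Math.ceil(maxCount / ARTIFACTS_PER_PAGE)
  const artifacts: unknown[] = []
  // `<=` rather than `<`: with 1-indexed pages, `<` silently drops the last
  // page, which is the off-by-one error the commit message describes.
  for (let page = 1; page <= maxNumberOfPages; page++) {
    const {artifacts: pageArtifacts, hasMore} = await fetchPage(page)
    artifacts.push(...pageArtifacts)
    if (!hasMore) break
  }
  return artifacts.slice(0, maxCount)
}
```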
Salman Muin Kayser Chishti d3ade9ecfc Prepare @actions/attest 2.0.0 release 2025-10-20 12:07:20 +01:00
functionstackx fb592eec03
Update packages/artifact/src/internal/find/list-artifacts.ts
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-17 18:05:06 -04:00
functionstackx 70e79399a2
fix: bumping max list artifact to 2k 2025-10-16 23:13:34 -04:00
Eugene acb230b99a
Merge pull request #2160 from actions/ejahnGithub-patch-1
Remove unnecessary Buffer to Uint8Array conversion
2025-10-16 12:22:23 -04:00
Eugene 5e0fa1aaaa
Remove unnecessary Buffer to Uint8Array conversion
Removed unnecessary conversion of Buffer to Uint8Array for compatibility.
2025-10-16 12:08:05 -04:00
Salman Chishti ac2468e605
Support Node.js 24 - Merge pull request #2110 from actions/salmanmkc/node24
Support Node.js 24
2025-10-16 16:25:47 +01:00
Salman Muin Kayser Chishti 3c8fcfce19 del file 2025-10-16 14:37:05 +01:00
Salman Muin Kayser Chishti 45467b9199 Lint 2025-10-16 14:34:09 +01:00
Salman Muin Kayser Chishti 700a55077d spacing 2025-10-16 14:27:39 +01:00
Salman Muin Kayser Chishti 6fa8f07827 Update based on testing to add trailing backslash to all results 2025-10-16 13:52:43 +01:00
Salman Muin Kayser Chishti d16e86a709 Add workflow to test readlink behavior on Windows across Node versions 2025-10-16 13:03:06 +01:00
Salman Muin Kayser Chishti ae3ac0db0c change back to lstat 2025-10-15 17:25:34 +01:00
Salman Muin Kayser Chishti b319d6afff Add comment to explain the method and return types 2025-10-15 17:14:54 +01:00
Salman Muin Kayser Chishti b8ac8fc14a lint 2025-10-15 17:08:33 +01:00
Salman Muin Kayser Chishti 028d621193 Merge remote-tracking branch 'origin/main' into salmanmkc/node24 2025-10-15 16:41:54 +01:00
Salman Muin Kayser Chishti b0d901f9c2 rebase led to this changing so reverting 2025-10-15 16:37:38 +01:00
Salman Muin Kayser Chishti 394e804dc8 remove skip lib check 2025-10-15 16:28:21 +01:00
Salman Muin Kayser Chishti d402248c45 Lint fix 2025-10-15 16:28:21 +01:00
Salman Muin Kayser Chishti 66e8437b3e Revert "Io util package usage update"
This reverts commit 783332a4b57e9455ec3a361c4e16f659a35f3a97.
2025-10-15 16:28:21 +01:00
Salman Muin Kayser Chishti 9c7501a5f3 Io util package usage update 2025-10-15 16:28:21 +01:00
Salman Muin Kayser Chishti 3b4b5725f0 Update packages, core doesn't need updates and update to use IO util update 2025-10-15 16:28:21 +01:00
Salman Muin Kayser Chishti 9a364e607b update io utils 2025-10-15 16:28:21 +01:00
Salman Muin Kayser Chishti 625c3f4856 change version for http-client 2025-10-15 16:28:21 +01:00
Salman Muin Kayser Chishti 1c3a637017 Update documentation 2025-10-15 16:28:21 +01:00
Salman Muin Kayser Chishti ec0ca1b19b fix typo 2025-10-15 16:28:21 +01:00
Salman Muin Kayser Chishti 57cd003e61 Update tests to use HTTPS for postman-echo.com and adjust proxy environment variable 2025-10-15 16:28:21 +01:00
Salman Muin Kayser Chishti b5befc6c6d Update HTTP tests to use HTTPS for postman-echo.com 2025-10-15 16:28:21 +01:00
Salman Muin Kayser Chishti 88a490d2ce override for node-fetch 2025-10-15 16:28:21 +01:00
Salman Muin Kayser Chishti a8d1fb0687 remove node 18 2025-10-15 16:27:29 +01:00
Salman Muin Kayser Chishti 347c887e54 package json 2025-10-15 16:27:29 +01:00
Salman Muin Kayser Chishti d5af54ee78 Update package versions 2025-10-15 16:27:29 +01:00
Salman Muin Kayser Chishti 44b9401378 Remove the need to update packages/core 2025-10-15 16:27:29 +01:00
Salman Muin Kayser Chishti fb5ae2a0e0 Keep attest at the same version 2025-10-15 16:27:29 +01:00
Salman Muin Kayser Chishti 8024983ab0 Update workflows and documentation to use the latest versions of first party actions that are available (checkout, setup-node, github-script)
2025-10-15 16:27:29 +01:00
Salman Muin Kayser Chishti d44f9b8f13 update some version numbers, will revise in a bit 2025-10-15 16:27:29 +01:00
Daniel Kennedy 9b4ee219ef fix: only mock the `cpus()` function on the `os` module instead of the whole module 2025-10-15 16:26:39 +01:00
Daniel Kennedy ee5d8970ad Take a direct dependency on `@azure/core-http` 2025-10-15 16:26:39 +01:00
dependabot[bot] 2874e3a741 Bump the artifact-minor-patch group in /packages/artifact with 5 updates
Bumps the artifact-minor-patch group in /packages/artifact with 5 updates:

| Package | From | To |
| --- | --- | --- |
| [@actions/core](https://github.com/actions/toolkit/tree/HEAD/packages/core) | `1.10.0` | `1.11.1` |
| [@azure/storage-blob](https://github.com/Azure/azure-sdk-for-js) | `12.15.0` | `12.28.0` |
| [@protobuf-ts/plugin](https://github.com/timostamm/protobuf-ts/tree/HEAD/packages/plugin) | `2.9.1` | `2.11.1` |
| [typedoc](https://github.com/TypeStrong/TypeDoc) | `0.25.4` | `0.28.13` |
| [typescript](https://github.com/microsoft/TypeScript) | `5.2.2` | `5.9.2` |

Updates `@actions/core` from 1.10.0 to 1.11.1
- [Changelog](https://github.com/actions/toolkit/blob/main/packages/core/RELEASES.md)
- [Commits](https://github.com/actions/toolkit/commits/HEAD/packages/core)

Updates `@azure/storage-blob` from 12.15.0 to 12.28.0
- [Release notes](https://github.com/Azure/azure-sdk-for-js/releases)
- [Changelog](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/Changelog-for-next-generation.md)
- [Commits](https://github.com/Azure/azure-sdk-for-js/compare/@azure/storage-blob_12.15.0...@azure/storage-blob_12.28.0)

Updates `@protobuf-ts/plugin` from 2.9.1 to 2.11.1
- [Release notes](https://github.com/timostamm/protobuf-ts/releases)
- [Commits](https://github.com/timostamm/protobuf-ts/commits/v2.11.1/packages/plugin)

Updates `typedoc` from 0.25.4 to 0.28.13
- [Release notes](https://github.com/TypeStrong/TypeDoc/releases)
- [Changelog](https://github.com/TypeStrong/typedoc/blob/master/CHANGELOG.md)
- [Commits](https://github.com/TypeStrong/TypeDoc/compare/v0.25.4...v0.28.13)

Updates `typescript` from 5.2.2 to 5.9.2
- [Release notes](https://github.com/microsoft/TypeScript/releases)
- [Changelog](https://github.com/microsoft/TypeScript/blob/main/azure-pipelines.release-publish.yml)
- [Commits](https://github.com/microsoft/TypeScript/compare/v5.2.2...v5.9.2)

---
updated-dependencies:
- dependency-name: "@actions/core"
  dependency-version: 1.11.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: artifact-minor-patch
- dependency-name: "@azure/storage-blob"
  dependency-version: 12.28.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: artifact-minor-patch
- dependency-name: "@protobuf-ts/plugin"
  dependency-version: 2.11.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: artifact-minor-patch
- dependency-name: typedoc
  dependency-version: 0.28.13
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: artifact-minor-patch
- dependency-name: typescript
  dependency-version: 5.9.2
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: artifact-minor-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-15 16:25:54 +01:00
Daniel Kennedy ad4afeeff1 Update the group names 2025-10-15 16:24:47 +01:00
Daniel Kennedy f9bdf6a054 Dependabot: add support for `/packages/artifact` and `/packages/cache` 2025-10-15 16:24:47 +01:00
Bassem Dghaidi 59c7ebde79 Prepare cache v4.1.0 release 2025-10-15 16:24:47 +01:00
Ryan Ghadimi 0c907a43d3 no need to resolve 2025-10-15 16:23:02 +01:00
Ryan Ghadimi d1c1fc4108 lint 2025-10-15 16:23:02 +01:00
Ryan Ghadimi 36f30e6d37 new error state, tests to cover 2025-10-15 16:23:02 +01:00
Ryan Ghadimi 308e05bc50 remove cache size limit 2025-10-15 16:23:02 +01:00
Salman Muin Kayser Chishti 33a9b6c09c update with dist updates 2025-10-15 16:22:51 +01:00
Daniel Kennedy ddc5fa4ae8
Merge pull request #2133 from actions/danwkennedy/test-blob-stream-timeout
Test: add a timeout test for downloading chunks from the stream
2025-09-25 10:54:19 -04:00
Daniel Kennedy 9b08f07cd3 Fix linting 2025-09-25 09:26:13 -04:00
Daniel Kennedy d26e9423f4 Test: add a timeout test for downloading chunks from the stream 2025-09-25 09:11:38 -04:00
Daniel Kennedy 714f93aedc
Merge pull request #2124 from akashchi/reject-on-download-failure
[ARTIFACT] Reject download promise if timeout was reached
2025-09-25 09:06:20 -04:00
Andrei Kashchikhin 844423665b lint 2025-09-25 10:53:34 +02:00
Daniel Kennedy f2ba502b92
Merge pull request #2136 from actions/dependabot/npm_and_yarn/packages/artifact/artifact-minor-patch-612b72ffd4
Bump the artifact-minor-patch group in /packages/artifact with 5 updates
2025-09-24 20:21:48 -04:00
Daniel Kennedy 1db3130eb3 fix: only mock the `cpus()` function on the `os` module instead of the whole module 2025-09-24 20:01:53 -04:00
Daniel Kennedy ca8a35d78f Take a direct dependency on `@azure/core-http` 2025-09-24 16:46:53 -04:00
dependabot[bot] f7f057193f
Bump the artifact-minor-patch group in /packages/artifact with 5 updates
Bumps the artifact-minor-patch group in /packages/artifact with 5 updates:

| Package | From | To |
| --- | --- | --- |
| [@actions/core](https://github.com/actions/toolkit/tree/HEAD/packages/core) | `1.10.0` | `1.11.1` |
| [@azure/storage-blob](https://github.com/Azure/azure-sdk-for-js) | `12.15.0` | `12.28.0` |
| [@protobuf-ts/plugin](https://github.com/timostamm/protobuf-ts/tree/HEAD/packages/plugin) | `2.9.1` | `2.11.1` |
| [typedoc](https://github.com/TypeStrong/TypeDoc) | `0.25.4` | `0.28.13` |
| [typescript](https://github.com/microsoft/TypeScript) | `5.2.2` | `5.9.2` |


Updates `@actions/core` from 1.10.0 to 1.11.1
- [Changelog](https://github.com/actions/toolkit/blob/main/packages/core/RELEASES.md)
- [Commits](https://github.com/actions/toolkit/commits/HEAD/packages/core)

Updates `@azure/storage-blob` from 12.15.0 to 12.28.0
- [Release notes](https://github.com/Azure/azure-sdk-for-js/releases)
- [Changelog](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/Changelog-for-next-generation.md)
- [Commits](https://github.com/Azure/azure-sdk-for-js/compare/@azure/storage-blob_12.15.0...@azure/storage-blob_12.28.0)

Updates `@protobuf-ts/plugin` from 2.9.1 to 2.11.1
- [Release notes](https://github.com/timostamm/protobuf-ts/releases)
- [Commits](https://github.com/timostamm/protobuf-ts/commits/v2.11.1/packages/plugin)

Updates `typedoc` from 0.25.4 to 0.28.13
- [Release notes](https://github.com/TypeStrong/TypeDoc/releases)
- [Changelog](https://github.com/TypeStrong/typedoc/blob/master/CHANGELOG.md)
- [Commits](https://github.com/TypeStrong/TypeDoc/compare/v0.25.4...v0.28.13)

Updates `typescript` from 5.2.2 to 5.9.2
- [Release notes](https://github.com/microsoft/TypeScript/releases)
- [Changelog](https://github.com/microsoft/TypeScript/blob/main/azure-pipelines.release-publish.yml)
- [Commits](https://github.com/microsoft/TypeScript/compare/v5.2.2...v5.9.2)

---
updated-dependencies:
- dependency-name: "@actions/core"
  dependency-version: 1.11.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: artifact-minor-patch
- dependency-name: "@azure/storage-blob"
  dependency-version: 12.28.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: artifact-minor-patch
- dependency-name: "@protobuf-ts/plugin"
  dependency-version: 2.11.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: artifact-minor-patch
- dependency-name: typedoc
  dependency-version: 0.28.13
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: artifact-minor-patch
- dependency-name: typescript
  dependency-version: 5.9.2
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: artifact-minor-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-24 20:07:58 +00:00
Daniel Kennedy 8e146e124e
Merge pull request #2134 from actions/danwkennedy/dependabot-artifact-cache
Dependabot: add support for `/packages/artifact` and `/packages/cache`
2025-09-24 16:06:29 -04:00
Daniel Kennedy 1ea77a84d7 Update the group names 2025-09-24 16:04:16 -04:00
Daniel Kennedy 7da95b182e Dependabot: add support for `/packages/artifact` and `/packages/cache` 2025-09-24 16:00:44 -04:00
Andrei Kashchikhin 7c689a5156 use error in both reject and destroy 2025-09-24 17:05:25 +02:00
Andrei Kashchikhin 8c6c662cda Merge remote-tracking branch 'upstream/main' into reject-on-download-failure 2025-09-24 17:03:35 +02:00
Bassem Dghaidi 3898ed70c4
Merge pull request #2132 from actions/Link-/cache-4.1.0
Prepare cache `v4.1.0` release
2025-09-24 14:35:45 +02:00
Bassem Dghaidi 9a41b33065 Prepare cache v4.1.0 release 2025-09-24 05:23:58 -07:00
Salman Muin Kayser Chishti 7aea3e735f changes 2025-09-08 15:37:51 +01:00
Salman Muin Kayser Chishti b1eb18b224 http 2025-09-08 15:36:39 +01:00
Salman Muin Kayser Chishti 48e42b1fdd linting 2025-09-04 15:24:57 +01:00
Salman Muin Kayser Chishti b738f10ef3 package updates 2025-09-04 15:15:02 +01:00
Salman Muin Kayser Chishti 8f32f385e0 Bump package versions, and fix issues 2025-09-04 14:16:27 +01:00
Salman Muin Kayser Chishti 011f07d1dc package changes 2025-09-04 12:58:54 +01:00
Salman Muin Kayser Chishti aa7077acfb Override to fix npm audit stuff 2025-09-04 12:49:31 +01:00
Salman Muin Kayser Chishti 86207b5042 remove engines 24 requirement from toolkit and fix test 2025-09-04 12:41:43 +01:00
Andrei Kashchikhin 523ce8ccda add reject 2025-09-01 11:52:11 +02:00
Ryan Ghadimi f58042f9cc
Merge pull request #2118 from actions/ghadimir/cache_size_restriction
Remove 10GB Cache Size Limit for Cache Service V2
2025-08-21 15:14:38 +01:00
Ryan Ghadimi 091616a0b8 no need to resolve 2025-08-13 13:38:51 +00:00
Ryan Ghadimi 8da1e670b6 lint 2025-08-13 13:37:36 +00:00
Ryan Ghadimi 06f7fd9df1 new error state, tests to cover 2025-08-13 13:00:46 +00:00
Ryan Ghadimi 0fe20e9d56 remove cache size limit 2025-08-13 10:14:18 +00:00
Salman Muin Kayser Chishti f82db4c00b audit fix 2025-08-08 12:26:34 +01:00
Salman Muin Kayser Chishti b8cca0c71f fix lint errors 2025-08-08 04:02:29 +01:00
Salman Muin Kayser Chishti 6f0cb0c45e Merge branch 'main' into salmanmkc/node24 2025-08-08 03:54:30 +01:00
Salman Muin Kayser Chishti 944ede4d09 custom readlink implementation for Windows compatibility with trailing backslashes 2025-08-08 03:46:42 +01:00
Bassem Dghaidi 227b1ce741
Merge pull request #2115 from actions/Link-/release-4.0.5
Prepare release `4.0.5`
2025-08-07 13:08:34 +02:00
Bassem Dghaidi 447ee85f36 Prepare release 4.0.5 2025-08-07 04:00:47 -07:00
Bassem Dghaidi a6be3de743
Merge pull request #2114 from actions/Link-/fix-cache-tests
Update cache package compilation step to only install runtime dependencies
2025-08-07 12:58:32 +02:00
Bassem Dghaidi 26b94036cb Merge branch 'Link-/fix-cache-tests' of github.com:actions/toolkit into Link-/fix-cache-tests 2025-08-07 03:50:04 -07:00
Bassem Dghaidi f3e6fb165e Fix linter complaints 2025-08-07 03:49:51 -07:00
Bassem Dghaidi 3a607d0f00
Update .github/workflows/cache-tests.yml
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-08-07 12:40:20 +02:00
Bassem Dghaidi c9316bb4a7
Update cache package compilation step to install only runtime dependencies 2025-08-07 03:38:19 -07:00
Bassem Dghaidi ec43e5810d
Merge pull request #2113 from actions/Link-/fix-runtime-deps
Reintroduce `@protobuf-ts/runtime-rpc` as a runtime dependency
2025-08-07 12:31:57 +02:00
Bassem Dghaidi 01715621b0 Replace @protobuf-ts/runtime with higher level dep @protobuf-ts/runtime-rpc 2025-08-07 03:21:24 -07:00
Bassem Dghaidi 6c64260c6d
Reintroduce @protobuf-ts/runtime as a runtime dependency v2.11.1 2025-08-07 03:15:33 -07:00
Bassem Dghaidi bf3fc9226a
Merge pull request #2111 from actions/Link-/cache-4.0.4
Prepare `@actions/cache` 4.0.4
2025-08-06 21:11:24 +02:00
Bassem Dghaidi c6723084aa Prepare release 4.0.4 2025-08-06 11:37:53 -07:00
Salman Muin Kayser Chishti bcb928642f format 2025-08-06 12:57:10 +01:00
Salman Muin Kayser Chishti 8c3fc9ed99 Update test to use mock 2025-08-06 12:49:50 +01:00
Salman Muin Kayser Chishti 1ef3214cee update for types 2025-08-01 11:50:25 +01:00
Salman Muin Kayser Chishti ece2273b24 updates 2025-07-31 23:48:44 +01:00
Salman Muin Kayser Chishti 717b895584 support node 24 2025-07-31 23:37:22 +01:00
Bassem Dghaidi 8ff772deb1
Merge pull request #2106 from actions/Link-/optimise-cache-deps
Move `@protobuf-ts/plugin` to dev dependencies
2025-07-31 14:03:14 +02:00
Bassem Dghaidi 8a3652e16d Optimise cache dependencies 2025-07-31 04:20:29 -07:00
Salman Chishti eb6226501b
Merge pull request #2076 from esainane/what-the-word-is-is
Fix typo in `core/README.md`
2025-07-31 12:03:28 +01:00
Bassem Dghaidi d65ee66d9b Move @protobuf-ts/plugin to dev dependencies 2025-07-28 07:58:41 -07:00
Bassem Dghaidi 6d3feab2bf
Merge pull request #2100 from actions/copilot/fix-2099
Improve cache service availability determination and implement conditional error logging
2025-07-28 16:49:10 +02:00
Bassem Dghaidi 79e1d8bb74
Merge pull request #2101 from actions/Link-/clarify-cache-hit-log
Explicit logging of cache key and restore key matches
2025-07-18 15:20:45 +02:00
copilot-swe-agent[bot] a0907ed2e2 Remove .nx/ from .gitignore as requested
Co-authored-by: Link- <568794+Link-@users.noreply.github.com>
2025-07-14 13:49:49 +00:00
copilot-swe-agent[bot] bd54a2413a Fix v1 cache service to only check ACTIONS_CACHE_URL
Co-authored-by: Link- <568794+Link-@users.noreply.github.com>
2025-07-14 13:39:13 +00:00
copilot-swe-agent[bot] 89397db14b Restore server error test and confirm logCacheError function removal
Co-authored-by: Link- <568794+Link-@users.noreply.github.com>
2025-07-14 13:01:02 +00:00
copilot-swe-agent[bot] d48d6b62a4 Remove logCacheError function and implement inline 5xx error detection as requested
Co-authored-by: Link- <568794+Link-@users.noreply.github.com>
2025-07-14 12:42:37 +00:00
copilot-swe-agent[bot] bab3dcf7f3 Complete PR feedback implementation: all cache tests passing
Co-authored-by: Link- <568794+Link-@users.noreply.github.com>
2025-07-14 12:26:40 +00:00
copilot-swe-agent[bot] c51178a15e Implement 5xx server error detection and fix most cache tests
Co-authored-by: Link- <568794+Link-@users.noreply.github.com>
2025-07-14 12:23:44 +00:00
copilot-swe-agent[bot] bbc6082700 Add .nx/ to .gitignore to exclude build cache files 2025-07-14 12:09:41 +00:00
copilot-swe-agent[bot] cf3aaeb491 Update tests to expect warnings instead of errors for non-5xx cache failures
Co-authored-by: Link- <568794+Link-@users.noreply.github.com>
2025-07-14 12:07:37 +00:00
Bassem Dghaidi cf4886cccb Fix linting issues 2025-07-14 03:49:28 -07:00
Bassem Dghaidi 0c5da92b52 Fix logging of cache key and restore key matches 2025-07-14 03:45:17 -07:00
copilot-swe-agent[bot] 513216f1dd Fix tests to expect errors instead of warnings for cache failures
- Update restoreCacheV2.test.ts, restoreCache.test.ts, saveCacheV2.test.ts, and saveCache.test.ts
- Change test expectations from core.warning to core.error for cache operation failures
- All tests now pass successfully

Co-authored-by: Link- <568794+Link-@users.noreply.github.com>
2025-07-14 10:38:47 +00:00
copilot-swe-agent[bot] 3c90578c30 Improve cache service availability determination and change warnings to errors
- Update isFeatureAvailable() to leverage ACTIONS_CACHE_SERVICE_V2 feature flag
- For v2: check ACTIONS_RESULTS_URL availability
- For v1: check either ACTIONS_CACHE_URL or ACTIONS_RESULTS_URL availability
- Change warning logs to error logs for cache failures
- Add comprehensive tests covering all scenarios

Co-authored-by: Link- <568794+Link-@users.noreply.github.com>
2025-07-14 10:32:34 +00:00
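A hedged sketch of the availability rules listed in this commit; the flag encoding and the helper shown here mirror the bullet points above but are illustrative, not the package's actual implementation:

```typescript
export function isFeatureAvailable(): boolean {
  // Assumed flag encoding: ACTIONS_CACHE_SERVICE_V2 arrives as an env variable.
  const useCacheServiceV2 = process.env['ACTIONS_CACHE_SERVICE_V2'] === 'true'
  if (useCacheServiceV2) {
    // v2: only ACTIONS_RESULTS_URL matters.
    return !!process.env['ACTIONS_RESULTS_URL']
  }
  // v1: either ACTIONS_CACHE_URL or ACTIONS_RESULTS_URL suffices.
  return (
    !!process.env['ACTIONS_CACHE_URL'] || !!process.env['ACTIONS_RESULTS_URL']
  )
}
```

Note that a follow-up commit in this same range (bd54a2413a, above) narrows the v1 branch to check `ACTIONS_CACHE_URL` only.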
copilot-swe-agent[bot] be5a2ce677 Initial plan 2025-07-14 10:19:51 +00:00
Ben De St Paer-Gotch 683703c114
Merge pull request #2086 from actions/nebuk89-patch-1
Update README.md
2025-06-16 10:06:48 +01:00
Ben De St Paer-Gotch c28e7d4d5f
Update README.md
Co-authored-by: Remy Suen <remy.suen@gmail.com>
2025-06-12 10:28:03 +01:00
Ben De St Paer-Gotch 12e323ae30
Update README.md 2025-06-10 16:39:47 +01:00
Sai Nane dbb1ea35ff Fix typo in `core/README.md` 2025-05-27 04:27:17 +00:00
Brian DeHamer f31c2921c1
Merge pull request #2058 from actions/dependabot/npm_and_yarn/packages/attest/undici-5.29.0
Bump undici from 5.28.5 to 5.29.0 in /packages/attest
2025-05-25 16:30:11 -07:00
dependabot[bot] 41b3ce3141
Bump undici from 5.28.5 to 5.29.0 in /packages/attest
Bumps [undici](https://github.com/nodejs/undici) from 5.28.5 to 5.29.0.
- [Release notes](https://github.com/nodejs/undici/releases)
- [Commits](https://github.com/nodejs/undici/compare/v5.28.5...v5.29.0)

---
updated-dependencies:
- dependency-name: undici
  dependency-version: 5.29.0
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-15 16:30:57 +00:00
Josh Gross 8d8a914a94
Document `context.runAttempt` in @actions/github 6.0.1 (#2054) 2025-05-13 10:37:14 -04:00
Brian DeHamer 36db4d62ad
Merge pull request #2045 from actions/dependabot/npm_and_yarn/packages/attest/octokit/endpoint-9.0.6
Bump @octokit/endpoint from 9.0.5 to 9.0.6 in /packages/attest
2025-05-08 10:47:59 -07:00
Brian DeHamer a25b686a45
Merge pull request #2044 from actions/dependabot/npm_and_yarn/packages/attest/octokit/request-error-5.1.1
Bump @octokit/request-error from 5.1.0 to 5.1.1 in /packages/attest
2025-05-08 10:47:20 -07:00
dependabot[bot] 957610a37a
Bump @octokit/request-error from 5.1.0 to 5.1.1 in /packages/attest
Bumps [@octokit/request-error](https://github.com/octokit/request-error.js) from 5.1.0 to 5.1.1.
- [Release notes](https://github.com/octokit/request-error.js/releases)
- [Commits](https://github.com/octokit/request-error.js/compare/v5.1.0...v5.1.1)

---
updated-dependencies:
- dependency-name: "@octokit/request-error"
  dependency-version: 5.1.1
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-08 11:19:50 +00:00
dependabot[bot] 6ed621e7d1
Bump @octokit/endpoint from 9.0.5 to 9.0.6 in /packages/attest
Bumps [@octokit/endpoint](https://github.com/octokit/endpoint.js) from 9.0.5 to 9.0.6.
- [Release notes](https://github.com/octokit/endpoint.js/releases)
- [Commits](https://github.com/octokit/endpoint.js/compare/v9.0.5...v9.0.6)

---
updated-dependencies:
- dependency-name: "@octokit/endpoint"
  dependency-version: 9.0.6
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-08 11:19:48 +00:00
Ryan Ghadimi 8007c1c535
Merge pull request #2049 from actions/ghadimir/audit_fix
NPM audit fixes
2025-05-08 12:18:34 +01:00
Ryan Ghadimi 6444290c57 release prep 2025-05-08 08:53:55 +00:00
Ryan Ghadimi f32d6bc043 bump octokit core 2025-05-08 08:42:32 +00:00
Ryan Ghadimi 2e4ab87130 artifact deps 2025-05-08 08:38:48 +00:00
Ryan Ghadimi ef199a9ab0
Merge pull request #2043 from actions/ghadimir/audit_fix
NPM Audit Fixes
2025-05-07 15:58:29 +01:00
Ryan Ghadimi 917a43eb6e bump octokit methods 2025-05-07 11:17:56 +00:00
Ryan Ghadimi 07cac0a6b3 bump gh package ver 2025-05-07 11:12:29 +00:00
Ryan Ghadimi 2046ee6d6b gh package release prep 2025-05-07 11:08:28 +00:00
Ryan Ghadimi 2b476323c4 fix packages/gh deps 2025-05-07 11:05:00 +00:00
Ryan Ghadimi aebe304a19
Merge pull request #2041 from actions/ghadimir/fix_cache_tests
Fix cache tests
2025-05-07 09:53:32 +01:00
Ryan Ghadimi e8f276a715 alphabetically order them 2025-05-07 08:31:17 +00:00
Ryan Ghadimi d156bcaa78 maybe this works instead 2025-05-06 20:22:05 +00:00
Ryan Ghadimi 5ae4c5be28 don't need that maybe 2025-05-06 20:08:50 +00:00
Ryan Ghadimi d50f1ac1b9 change url 2025-05-06 20:02:27 +00:00
Ryan Ghadimi 87cb7035bb add env variable for cache tests 2025-05-06 19:50:44 +00:00
Alisson Tenório 1b1e81526b
Update README.md (#1719) 2025-04-09 10:46:07 -04:00
Salman Chishti 525ebf0c50
Merge pull request #2004 from AbhiPrasad/patch-1
fix link in `@actions/artifact` `RELEASES.md`
2025-04-09 15:34:10 +01:00
Abhijeet Prasad 07341e11d8
fix link in `@actions/artifact` `RELEASES.md` 2025-03-26 11:22:14 -04:00
Salman Chishti 930c890727
Merge pull request #1995 from actions/salmanmkc/2-new-cache-artifacts-release
Prepare Cache v4.0.3 & Artifact v2.3.2 releases
2025-03-17 21:22:10 +00:00
Salman Chishti a410c4a9cf
remove extra brace 2025-03-17 17:14:25 +00:00
Salman Chishti 10277d48ca
Add update to release doc, as will include it in this release 2025-03-17 17:12:32 +00:00
JoannaaKL 857c61a9df
Merge pull request #1994 from gitulisca-enterprise-cloud-testing/gitulisca/log-restore-request-version
Log cache version requested on debugging message
2025-03-17 17:58:16 +01:00
Salman Chishti c40bccc9c3
Use patch instead of minor 2025-03-17 14:08:42 +00:00
Salman Chishti ff4d4afef8
shared instead of secure 2025-03-17 12:48:56 +00:00
Salman Chishti 4d4bbebd6a
update package-lock.json 2025-03-17 12:47:54 +00:00
Salman Chishti 261fcae498
change it to minor version instead of patch 2025-03-17 12:44:51 +00:00
Salman Chishti 4059d2af66
update versions for cache and artifact 2025-03-17 12:09:16 +00:00
Salman Chishti 2559a2ac8a
Merge pull request #1982 from actions/salmanmkc/obfuscate-sas
Remove logging of any SAS tokens in Actions/Cache and Actions/Artifact
2025-03-17 11:47:29 +00:00
Art Leo 514314311c
Log cache version requested 2025-03-15 10:13:43 +11:00
Salman Chishti 957d42e6c5 add encoding back with extra tests 2025-03-14 06:38:57 -07:00
Salman Chishti 39419dd8c3 don't need to url encode or set var 2025-03-14 06:21:41 -07:00
Salman Chishti d13e6311f1 fix tests 2025-03-14 04:28:22 -07:00
Salman Chishti 6876e2a664 update ts docs 2025-03-13 04:47:49 -07:00
Salman Chishti fc482662af PR feedback, back to simplified approach, no export on client as well 2025-03-13 04:23:45 -07:00
Salman Chishti abd9054c61 Log debug error when failing to decode 2025-03-12 08:14:01 -07:00
Ryan Ghadimi 253e837c4d
Merge pull request #1991 from actions/ghadimir/hash_to_digest_upload
Change hash to digest for consistent terminology across runner logs
2025-03-12 12:26:25 +00:00
Salman Chishti 3ac34ffcb7 Mask different situations, malformed URL, encoded, decoded, raw signatures, nested parameters, and moved to a utility file 2025-03-12 03:17:35 -07:00
Ryan Ghadimi 56c5a39afb
Update blob-upload.ts 2025-03-12 07:59:00 +00:00
Ryan Ghadimi 7ae578ddd1
Merge pull request #1987 from actions/ghadimir/digest_typo
Bump release version
2025-03-11 11:07:20 +00:00
Ryan Ghadimi b2d2270685 Bump package.json 2025-03-11 11:02:42 +00:00
Ryan Ghadimi 0d1d5c7687 Bump release version 2025-03-11 10:58:38 +00:00
Ryan Ghadimi 769bb0fea1
Merge pull request #1986 from actions/ghadimir/digest_typo
Fix comment on expectedHash
2025-03-11 10:57:05 +00:00
Ryan Ghadimi d7ddca4309 Fix comment on expectedHash 2025-03-11 10:52:19 +00:00
Ryan Ghadimi 8780507298
Merge pull request #1985 from actions/ghadimir/dropdown_releases
Dropdown for package when releasing
2025-03-10 15:42:45 +00:00
Ryan Ghadimi 790c56665a
Update releases.yml 2025-03-10 15:33:38 +00:00
Ryan Ghadimi 9d8017eadb
Merge pull request #1976 from actions/ghadimir/prep_artifact_release
Prepare for Artifact v2.3.0 release
2025-03-10 15:23:55 +00:00
Ryan Ghadimi 20fee3ea63
Update @actions/artifact version to 2.3.0 2025-03-10 15:12:36 +00:00
Ryan Ghadimi 7501423b6f
Update RELEASES.md for version 2.3.0 2025-03-10 15:11:43 +00:00
Ryan Ghadimi d0cc3418ea
Bump version to 2.3.0
Better semver
2025-03-10 15:11:18 +00:00
Salman Chishti 5007821c77 Remove clean script 2025-03-10 06:51:30 -07:00
Salman Chishti 47c4fa85df masks the whole URL, update tests 2025-03-10 06:47:52 -07:00
Salman Chishti 1cd2f8a538 Instead of using utility method in core lib, use method in both twirp clients 2025-03-07 06:01:25 -08:00
Ryan Ghadimi b85d4e6b38 Prepare for Artifact v2.2.3 release 2025-03-07 10:14:36 +00:00
Ryan Ghadimi dc22dc7cad
Merge pull request #1975 from actions/ghadimir/update_call_to_list_artifacts
Compare Artifact Digests
2025-03-07 09:51:05 +00:00
Ryan Ghadimi 8c05dc87d8
Change info logs to debug logs 2025-03-07 09:38:33 +00:00
Salman Chishti 884aa17886 remove these changes 2025-03-06 14:31:21 -08:00
Salman Chishti 944e6b78db Add secret and signature masking for cache and artifact packages 2025-03-06 14:25:32 -08:00
JoannaaKL d70fb49aaa
Merge pull request #1974 from actions/list-artifacts-fix
Don't skip pages
2025-03-06 09:35:57 +01:00
Ryan Ghadimi 3726c11433 Please the linter 2025-03-05 14:44:58 +00:00
Ryan Ghadimi 71b40f7024 nicer wording 2025-03-05 14:35:01 +00:00
Ryan Ghadimi 83e5e2517b Change some debug -> info for artifacts hash logging 2025-03-05 14:30:51 +00:00
Ryan Ghadimi d5c8a0fa27 Update proto artifact interface, retrieve artifact digests, return indicator of mismatch failure 2025-03-05 11:29:44 +00:00
JoannaaKL 780e24be34
Don't skip pages 2025-03-05 09:27:35 +00:00
Brian DeHamer ec9716b3cc
Merge pull request #1969 from actions/bdehamer/workflow-ref
set workflow.ref provenance field from ref claim
2025-02-26 09:50:14 -08:00
Brian DeHamer 0bc338adab
set workflow.ref provenance field from ref claim
Updates the `buildSLSAProvenancePredicate` function to populate the
`workflow.ref` field from the `ref` claim in the OIDC token.

Signed-off-by: Brian DeHamer <bdehamer@github.com>
2025-02-26 08:47:27 -08:00
Rob Herley 5378ea8eca
Merge pull request #1968 from actions/robherley/cache/v4.0.2
cache: prep v4.0.2 release
2025-02-25 16:00:06 -05:00
Brian DeHamer b95b593ca5
Merge pull request #1957 from actions/bdehamer/update-undici
Bump undici to v5.28.5
2025-02-25 12:54:29 -08:00
Rob Herley 4fedf471b1
cache: prep v4.0.2 release 2025-02-25 15:03:37 -05:00
Rob Herley 1b9063ee0e
Merge pull request #1966 from actions/robherley/wrap-create-cache-err
cache: wrap create failures in ReserveCacheError
2025-02-25 15:00:25 -05:00
Rob Herley d096588f08
cache: wrap create failures in ReserveCacheError 2025-02-25 12:49:08 -05:00
Yang Cao 662b9d91f5
Merge pull request #1963 from actions/yacaovsnc/release_2_2_2
Prepare artifact release 2.2.2
2025-02-20 16:29:30 -05:00
Yang Cao a62f530b6f Update package-lock.json 2025-02-20 21:20:28 +00:00
Yang Cao 2995cdf0a1 Prepare artifact release 2.2.2 2025-02-20 21:12:25 +00:00
Yang Cao f10f9c8217
Merge pull request #1962 from actions/yacaovsnc/set_default_concurrency_to_5
Default upload artifacts concurrency to 5
2025-02-20 13:56:30 -05:00
Yang Cao c26e6f3aba Default upload artifacts concurrency to 5 2025-02-20 17:03:29 +00:00
Rob Herley 2b08dc18f2
Merge pull request #1958 from actions/robherley/cache/v4.0.1
Update manifests & release notes for cache v4.0.1
2025-02-14 12:20:52 -05:00
Brian DeHamer 412108cd55
add undici to @actions/github dependencies
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2025-02-14 08:12:00 -08:00
Rob Herley 8fcec1fb58
update manifests & release notes for cache v4.0.1 2025-02-14 11:02:13 -05:00
Brian DeHamer 95e747361e
bump undici to 5.28.5
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2025-02-14 08:02:10 -08:00
Rob Herley aad39a371f
Merge pull request #1954 from actions/robherley/miss-msg
Cache miss as debug, not warning annotation
2025-02-14 10:58:45 -05:00
Rob Herley 7fe619c58c
update mocks 2025-02-14 09:42:41 -05:00
Rob Herley e6fb8f1c5d
cache miss as debug, not warning annotation 2025-02-14 09:28:01 -05:00
Rob Herley 6a942b304d
Merge pull request #1947 from actions/robherley/rm-twirp-ts
Remove runtime dependency on `twirp-ts`
2025-02-14 09:14:17 -05:00
Ehsan Hosseini 340a6b15b5
update undici package to 5.25.5 (#1942) 2025-01-28 10:14:55 -05:00
Rob Herley e0c069db55
remove runtime dependency on twirp-ts 2025-01-27 17:52:55 +00:00
Josh Gross 1f7c2c79e0
[tool-cache] Update `@actions/core` and prepare 2.0.2 release (#1872)
* Update `@actions/core` and prepare 2.0.2 release

* Include these changes in the release notes
2025-01-15 15:57:09 -05:00
Yang Cao 5e8c25d1f5
Merge pull request #1929 from actions/yacaovsnc/release_artifact_2_2_1
Prep release packages/artifact v2.2.1
2025-01-09 09:21:32 -05:00
Yang Cao 3095d112ef Prep release packages/artifact v2.2.1 2025-01-08 21:11:59 +00:00
Yang Cao 16ef1448d7
Merge pull request #1928 from actions/yacaovsnc/artifact_upload_concurrency_and_timeout
Make both upload concurrency and timeout settings configurable with env variables.
2025-01-08 16:07:30 -05:00
Yang Cao e55409315f Rename the prefix to be more specific 2025-01-08 20:32:45 +00:00
Yang Cao d4385a64a7 Concurrency has a min of 1 2025-01-08 18:14:04 +00:00
Yang Cao ede05b95d7 Make concurrency change opt-in, but can only go lower 2025-01-08 18:11:38 +00:00
Yang Cao f3c12d5561 Set default concurrency to 10 and make timeout configurable 2025-01-08 16:19:09 +00:00
Josh Gross adb9c4a7f4
Remove more unused cache APIs (#1909) 2024-12-19 13:26:19 -05:00
Josh Gross 01f21badd5
Remove more unused cache APIs 2024-12-17 14:51:57 -05:00
Josh Gross 26f8f84a96
Remove unused cache API (#1907) 2024-12-17 14:04:05 -05:00
Brian DeHamer 433f76091b
Merge pull request #1908 from actions/bdehamer/artifact-2.2.0
Prepare artifact release 2.2.0
2024-12-17 10:24:18 -08:00
Brian DeHamer 4426b4ea91
Prepare artifact release 2.2.0
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-12-17 10:05:45 -08:00
Brian DeHamer f522fdf89d
Merge pull request #1896 from actions/bdehamer/artifact-digest
return artifact digest on upload
2024-12-17 10:01:16 -08:00
Brian DeHamer 1e0c16f0dc
return artifact digest on upload
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-12-06 14:27:46 -08:00
Bassem Dghaidi b7a00a3203
Merge pull request #1886 from actions/Link-/cache-4.0.0
Prepare `@actions/cache` `4.0.0` release
2024-12-04 20:09:19 +01:00
Bassem Dghaidi 0827eef58f Rerun CI 2024-12-04 10:53:00 -08:00
Bassem Dghaidi cd9197e9bd Add announcement link 2024-12-04 08:23:10 -08:00
Bassem Dghaidi 72447df44c Update deprecation notice 2024-12-04 05:33:47 -08:00
Bassem Dghaidi 59845ec372 Update deprecation notice 2024-12-04 05:30:50 -08:00
Bassem Dghaidi cb001af8a3 Update README to include deprecation notice 2024-12-03 02:52:39 -08:00
Bassem Dghaidi 4498687c5e Prepare @actions/cache 4.0.0 release 2024-12-03 02:40:00 -08:00
Bassem Dghaidi a10e209c8d
Merge pull request #1882 from actions/enhance-blob-client
Enhance blob client resilience & performance
2024-12-02 20:48:46 +01:00
Bassem Dghaidi c02c929c56 Minor comment adjustments 2024-12-02 11:10:25 -08:00
Bassem Dghaidi c649df4b94 Minor comment adjustments 2024-12-02 10:55:33 -08:00
Bassem Dghaidi fb40492b6f
Merge branch 'enhance-blob-client' of github.com:actions/toolkit into enhance-blob-client 2024-12-02 10:55:00 -08:00
Bassem Dghaidi 502e8ce651 Minor comment adjustments 2024-12-02 10:53:29 -08:00
Bassem Dghaidi 3f7df8ec5a
Fix comments
Co-authored-by: Josh Gross <joshmgross@github.com>
2024-12-02 19:46:18 +01:00
Bassem Dghaidi b24632bd80
Fix comments
Co-authored-by: Josh Gross <joshmgross@github.com>
2024-12-02 19:46:11 +01:00
Bassem Dghaidi 792ec716de Tune upload options 2024-12-02 07:32:33 -08:00
Bassem Dghaidi 7ad18fd6bd Fix linter complaints 2024-12-02 04:24:17 -08:00
Bassem Dghaidi 87171e29ca Fix tests 2024-12-02 04:18:46 -08:00
Bassem Dghaidi a762876d6d Minor refactoring 2024-12-02 04:08:21 -08:00
Bassem Dghaidi d89855bb90 Fix upload progress bug 2024-12-02 03:55:57 -08:00
Bassem Dghaidi db1d01308c Troubleshoot 2024-12-02 03:35:20 -08:00
Bassem Dghaidi 4a272e9053 Troubleshoot 2024-12-02 03:08:05 -08:00
Bassem Dghaidi ee1c07d0aa Add error handling for failed uploads 2024-12-02 02:38:51 -08:00
Bassem Dghaidi c6f1224d30 Add progress tracking for blob uploads 2024-12-02 02:33:27 -08:00
Bassem Dghaidi 1d403c2fd8 Fix tests 2024-11-29 07:36:51 -08:00
Bassem Dghaidi 65892d5ffe Fine tune blob uploads 2024-11-29 07:09:05 -08:00
Bassem Dghaidi 8c5f6f2dc5 Force use of Azure for restoreCacheV2 2024-11-28 07:42:07 -08:00
Bassem Dghaidi 62f5f1885b Refactor saveCacheV2 to use saveCache from cacheHttpClient 2024-11-28 07:22:01 -08:00
Bassem Dghaidi eaf0083ee2 Respect download options for restore 2024-11-28 04:56:37 -08:00
Bassem Dghaidi c1fb081674
Linter fixes 2024-11-28 03:53:34 -08:00
Bassem Dghaidi df166709a3
Refactor cache upload functionality and improve test cases 2024-11-28 03:52:09 -08:00
Bassem Dghaidi c5a5de05f6 Delete download-cache 2024-11-28 03:36:32 -08:00
Bassem Dghaidi 3a128c88c3 Merge branch 'main' into enhance-blob-client 2024-11-27 08:25:51 -08:00
John Sudol 9cc30cb0d3
Add `saveCacheV2` tests (#1879) 2024-11-27 09:30:36 -05:00
Bassem Dghaidi 35d87ab129
Refactor code formatting for consistency and readability 2024-11-27 05:58:22 -08:00
Bassem Dghaidi af3981c955 Update the useragent of the old http client to pass cache version 2024-11-27 05:50:01 -08:00
Bassem Dghaidi 27e5cf2514 Replace downloadCacheFile with downloadCacheStorageSDK 2024-11-27 04:51:21 -08:00
John Sudol b050504b2d Add test case for when the uploadFile fails on the blobclient 2024-11-27 01:45:46 +00:00
John Sudol 5d0a4af70a Remove unused mock 2024-11-26 23:33:19 +00:00
John Sudol 94f18eb26e Only mock the cacheUtil methods we need 2024-11-26 23:05:11 +00:00
John Sudol 208dbe2131 PR feedback 2024-11-26 16:36:12 +00:00
John Sudol 46174ed573 run prettier 2024-11-26 00:56:07 +00:00
John Sudol 1f087496ca Add debug message for uploadResponse 2024-11-26 00:43:37 +00:00
John Sudol 8f606682c2 Add saveCacheV2 tests 2024-11-26 00:23:42 +00:00
Bassem Dghaidi 928d3e806d
Merge pull request #1876 from actions/add-restore-tests
Add `restoreCacheV2` tests
2024-11-25 21:35:31 +01:00
Bassem Dghaidi 35ede8fcf0 Add a new debug message for downloads 2024-11-25 12:08:07 -08:00
Bassem Dghaidi 4d31e1048a Add the download cache file status code to debug log 2024-11-25 07:34:52 -08:00
Bassem Dghaidi 0e321b26f4 Add the download cache file status code to debug log 2024-11-25 07:34:07 -08:00
Bassem Dghaidi 2d2513915c
Remove unused package
Co-authored-by: Rob Herley <robherley@github.com>
2024-11-25 16:13:20 +01:00
Bassem Dghaidi de236da416 Fix cache lookup scenario 2024-11-25 05:47:51 -08:00
Bassem Dghaidi 4dadd612d6 Add support for matching on restore key values 2024-11-25 05:42:50 -08:00
Bassem Dghaidi 54ac2dd012 Add cache service version debug message 2024-11-25 04:08:47 -08:00
Bassem Dghaidi 4de30f744e Add more tests for restoreCacheV2 2024-11-25 03:53:03 -08:00
Bassem Dghaidi 27dfd2c41c Merge branch 'main' into add-restore-tests 2024-11-22 10:23:10 -08:00
Bassem Dghaidi 20ed2908f1
Merge pull request #1857 from actions/neo-cache-service
Integrate cache service v2
2024-11-22 19:22:23 +01:00
Bassem Dghaidi 39d19810a8 Add restore tests 2024-11-22 09:01:59 -08:00
Bassem Dghaidi e2028d43a2 Linter fixes and remove unnecessary dependency 2024-11-21 04:05:04 -08:00
Bassem Dghaidi 267841d7bd
Add isGhes gate and refactor to clean up circular dependencies 2024-11-21 04:01:44 -08:00
Bassem Dghaidi ab58a59f33 Bump cross-spawn to 7.0.6 2024-11-20 14:02:54 -08:00
Bassem Dghaidi a1e6ef3759 Update cache service APIs & cleanup 2024-11-20 13:53:47 -08:00
Bassem Dghaidi 8616c313a2 Remove unused definitions 2024-11-14 07:11:12 -08:00
Bassem Dghaidi 3ca85474b8 Merge branch 'neo-cache-service' of github.com:actions/toolkit into neo-cache-service 2024-11-14 06:50:01 -08:00
Bassem Dghaidi 6c11d441a5
Remove unnecessary type hints 2024-11-14 06:49:55 -08:00
Bassem Dghaidi 68ab87caa2
Add check to make sure archive has been created already
Co-authored-by: Josh Gross <joshmgross@github.com>
2024-11-14 15:49:02 +01:00
Bassem Dghaidi 555b03f6fd Revert package.json 2024-11-14 06:40:10 -08:00
Bassem Dghaidi ab8110fa2f Remove unnecessary packages from top level package.json 2024-11-14 06:36:42 -08:00
Bassem Dghaidi 5e9ef8532f Lint fixes 2024-11-14 04:47:27 -08:00
Bassem Dghaidi ea4bf4810a Remove unnecessary debug information 2024-11-14 04:39:30 -08:00
Bassem Dghaidi c3e354da23 Remove unnecessary debug information 2024-11-14 04:33:31 -08:00
Bassem Dghaidi 2ee77e654f Add missing function return types 2024-11-14 03:42:14 -08:00
Bassem Dghaidi 83baffc3f6
Package upgrades with security fixes 2024-11-14 03:34:32 -08:00
Bassem Dghaidi 19cdd5f210
Linter cleanups 2024-11-14 03:34:13 -08:00
Bassem Dghaidi b2557ac90c Formatting and stylistic cleanup 2024-11-14 03:22:03 -08:00
Bassem Dghaidi 69409b3acd
Fix broken test 2024-11-14 03:10:48 -08:00
Bassem Dghaidi 9dff82c727
Port dependencies & remove dependency on toolkit/artifacts 2024-11-14 03:01:04 -08:00
Bassem Dghaidi d109d9c03e
Handle ACTIONS_CACHE_SERVICE_V2 feature flag 2024-11-14 03:00:43 -08:00
Bassem Dghaidi 4e1912a3c3 Restore __tests__ 2024-11-14 02:08:24 -08:00
Bassem Dghaidi 9da70ffbd7 Post merge cleanup 2024-11-14 02:04:20 -08:00
Bassem Dghaidi 75cdb2c08f
Merge branch 'main' into neo-cache-service 2024-11-14 02:02:55 -08:00
Josh Gross bb2278e5cf
Extend Node version test coverage (#1843)
* Extend Node version test coverage

* Remove Node 16
2024-11-08 10:30:18 -05:00
Josh Gross 77f247b2f3
Prepare `@actions/cache` 3.3.0 release (#1871) 2024-11-01 13:32:42 -04:00
Brian DeHamer d13839fcf4
Merge pull request #1870 from actions/bdehamer/attest-1.5-release-notes
`@actions/attest`: Release notes for v1.5.0 release
2024-11-01 09:55:13 -07:00
Brian DeHamer 7e54468896
update release notes for @actions/attest v1.5.0
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-11-01 09:45:11 -07:00
Brian DeHamer 339447c5d3
Merge pull request #1863 from meriadec/attest-provenance-tags
Handle tags containing "@" character in `buildSLSAProvenancePredicate`
2024-11-01 09:35:13 -07:00
Brian DeHamer 43ce96d373
Merge pull request #1865 from actions/bdehamer/multi-subject
`@actions/attest`: Support multi-subject attestations
2024-11-01 09:33:11 -07:00
Brian DeHamer 265a5be8bc
support multi-subject attestations
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-11-01 09:08:19 -07:00
Brian DeHamer 65ee4d33af
use macos-latest-large in test/release workflows (#1869)
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-11-01 11:59:55 -04:00
Meriadec Pillet 717ba9d9a4
Handle tags containing "@" character in `buildSLSAProvenancePredicate`
When using some monorepo-related tools (like [changesets](https://github.com/changesets/changesets)),
the produced tags have a special format that includes `@` character.

For example, a `foo` package in a monorepo will produce Git tags looking
like `foo@1.0.0` if using changesets.

When used in combination with `actions/attest-build-provenance`, the
action was not properly re-crafting the tag in `buildSLSAProvenancePredicate` because
it was always splitting the workflow ref by `@` and taking the second
element.

This results in the following error on CI:

```
Error: Error: Failed to persist attestation: Invalid Argument - values do not match: refs/tags/foo != refs/tags/foo@1.0.0 - https://docs.github.com/rest/repos/repos#create-an-attestation
```

This PR slightly updates the logic there to instead take "everything
located after the first '@'". This shouldn't introduce any breaking
change, while adding support for custom tags.

I've added the corresponding test case and it passes; however, I couldn't
successfully run the full test suite (nor could I on `main`). Looking
forward to the CI outcome.

Thanks in advance for the review 🙏.
2024-10-30 14:29:42 +01:00
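A minimal sketch of the splitting change described above, runnable as-is; the example ref is hypothetical:

```typescript
const workflowRef =
  'owner/repo/.github/workflows/release.yml@refs/tags/foo@1.0.0'

// Before: split on '@' and take the second element, which truncates the tag
// to "refs/tags/foo" whenever the tag itself contains an '@'.
const refBefore = workflowRef.split('@')[1]

// After: take everything located after the first '@'.
const refAfter = workflowRef.slice(workflowRef.indexOf('@') + 1)

console.log(refBefore) // refs/tags/foo
console.log(refAfter) // refs/tags/foo@1.0.0
```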
Bassem Dghaidi 01bf918aa5 Refactoring & cleanup 2024-10-24 06:09:23 -07:00
Bassem Dghaidi 28dbd8ff93
Cleanups and package refactoring 2024-10-24 05:19:48 -07:00
Josh Gross 7f5921cddd
Document unreleased changes in `cache` and `tool-cache` (#1856) 2024-10-22 12:01:31 -04:00
Bassem Dghaidi 89354f6540
Cleanup implementation and use tarballs instead of streaming zip 2024-10-21 05:21:32 -07:00
Bassem Dghaidi d399e33060 Merge branch 'main' into neo-cache-service 2024-10-21 02:25:12 -07:00
Brian DeHamer 29d342f176
Merge pull request #1848 from actions/bdehamer/attest-prep-1-5
`@actions/attest`: prep release of @actions/attest v1.5.0
2024-10-14 12:49:33 -07:00
Brian DeHamer 72113fe791
Merge pull request #1847 from actions/bdehamer/attest-update-core
`@actions/attest`: bump @actions/core from 1.10.1 to 1.11.1
2024-10-14 12:49:15 -07:00
Brian DeHamer 7b4d9763cc
Merge pull request #1846 from actions/bdehamer/sigstore-3-0-0
`@actions/attest`: bump @sigstore/sign from 2.3.2 to 3.0.0
2024-10-14 12:48:55 -07:00
Brian DeHamer 26c752f562
prep release of @actions/attest v1.5.0
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-10-14 12:33:10 -07:00
Brian DeHamer ac1332a8e2
bump @actions/core from 1.10.1 to 1.11.1
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-10-14 12:16:09 -07:00
Brian DeHamer c6c5ef6b8e
bump @sigstore/sign from 2.3.2 to 3.0.0
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-10-14 12:06:26 -07:00
Bassem Dghaidi 4d1dedf2c7
Merge branch 'main' into neo-cache-service 2024-10-09 07:45:11 -07:00
Bassem Dghaidi 13abc95165
Port restoreCache to new service 2024-10-09 04:32:57 -07:00
Rob Herley ee93b05ee9
Merge pull request #1845 from actions/robherley/update-release-notes
Update artifact release notes
2024-10-08 14:11:08 -04:00
Rob Herley 799f8f5f3d
Update artifact release notes
Includes:
- #1815
2024-10-08 14:06:04 -04:00
Rob Herley 201b082ce1
Merge pull request #1844 from actions/robherley/artifact-2.1.11
Properly resolve relative symlinks
2024-10-08 13:08:45 -04:00
Rob Herley 49cbbbcd99
Update symlink bug fix reference number 2024-10-08 13:02:06 -04:00
Rob Herley 545e0e6b95
properly resolve relative symlinks 2024-10-08 12:35:48 -04:00
JoannaaKL c18a7d2f73
Merge pull request #1815 from mydea/fn/remove-crypto
Use native `crypto` package from node
2024-10-07 11:06:38 +02:00
Josh Gross d14afd7973
Explicitly import `crypto` (#1842)
* Explicitly import `crypto`

* Add release notes for 1.11.1

* Fix crypto mock in test

* Fix `crypto` mock

* Lint
2024-10-04 17:23:42 -04:00
Josh Gross 22a72ac3d7
Include #1551 in `@actions/core` 1.11.0 release notes (#1840) 2024-10-02 14:30:25 -04:00
Josh Gross 6ca0d9b637
Release `@actions/core v1.11.0` (#1839) 2024-10-02 13:49:03 -04:00
Rob Herley 650f7c6aa3
Merge pull request #1830 from actions/robherley/artifact-2.1.10
Fix regression, auto readlink on symlinks again
2024-10-02 13:06:15 -04:00
Josh Gross 78af634e7e
Remove dependency on `uuid` package (#1824) 2024-10-02 12:28:06 -04:00
Rob Herley 2a8f1c5ddd
bump package lock version 2024-10-01 16:43:30 -04:00
Bassem Dghaidi e62c6428e7 Fix service urls 2024-09-24 03:29:14 -07:00
Bassem Dghaidi 07e51a445e Add cache service v2 client 2024-09-24 03:17:44 -07:00
Bassem Dghaidi 70e5684b1f
Merge branch 'main' into neo-cache-service 2024-09-24 02:36:02 -07:00
Rob Herley 5a62022195
/ 2024-09-20 17:52:14 -04:00
Rob Herley 8551843690
fix assertion 2024-09-20 17:45:55 -04:00
Rob Herley d6694e491d
update release notes 2024-09-20 17:31:40 -04:00
Rob Herley 7f19a7886a
fix regression, auto readlink on symlinks again 2024-09-20 17:23:43 -04:00
Brian DeHamer 6dd369c0e6
Merge pull request #1823 from actions/bdehamer/enterprise-issuer
[@actions/attest] Fix bug with customized OIDC issuer
2024-09-05 09:17:37 -07:00
Brian DeHamer 2a07de1333
fix bug with customized oidc issuer
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-09-04 10:24:28 -07:00
Francesco Novy 2e1998fc42 update lockfile 2024-08-30 09:41:33 +02:00
Francesco Novy b7a914b73b Use native `crypto` package from node 2024-08-30 09:30:02 +02:00
Brian DeHamer 6c4e082c18
Merge pull request #1805 from actions/bdehamer/update-http-client
bump @actions/http-client from 2.2.1 to 2.2.3
2024-08-22 08:39:26 -07:00
Brian DeHamer 1e69bffbba
bump @actions/http-client from 2.2.1 to 2.2.3
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-08-22 07:52:03 -07:00
Thomas Boop d1aa255c7f
HTTP Client 2.2.3 Release (#1804)
* http-client 2.2.3

* fix audit

* Revert "fix audit"

This reverts commit 724956ffa7.

* update versions

* Revert "update versions"

This reverts commit 139b3391a0.

* exclude dev dependencies while we work on removing lerna
2024-08-22 10:13:36 -04:00
Brian DeHamer 7298ff3219
Merge pull request #1799 from actions/bdehamer/http-client-proxy-auth
fix encoding for proxy auth token
2024-08-21 06:41:49 -07:00
Brian DeHamer 571d782946
Merge pull request #1797 from actions/bdehamer/attester-release-notes
improve release notes for @actions/attest
2024-08-19 07:38:36 -07:00
Brian DeHamer ada9e00cda
fix encoding for proxy auth token
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-08-16 15:03:40 -07:00
Josh Gross faf9cb2ea2
Include the package name in the Publish Workflow run (#1793) 2024-08-16 16:15:14 -04:00
Brian DeHamer ac3a063583
improve release notes for @actions/attest
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-08-16 12:43:39 -07:00
Brian DeHamer 7cc96bb976
Merge pull request #1796 from actions/bdehamer/attest-issuer
derive default OIDC issuer from current tenant
2024-08-16 12:21:00 -07:00
Brian DeHamer fa6cc53297
derive default OIDC issuer from current tenant
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-08-16 12:07:23 -07:00
Thomas Boop f299e8ba1e
HTTP Client 2.2.2 Release (#1794)
* 2.2.2 release

* update nodes
2024-08-16 13:11:10 -04:00
Yu 1b9927d1c7
Handle Encoded URL for Proxy Username and Password in HTTP Client (#1782)
* uri-decode-fix

Signed-off-by: Yu <yu.yang@anz.com>

* http-client URLdecode fix

Signed-off-by: Yu <yu.yang@anz.com>

* http-client URLdecode test typo fix

Signed-off-by: Yu <yu.yang@anz.com>

---------

Signed-off-by: Yu <yu.yang@anz.com>
2024-08-16 12:43:10 -04:00
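A sketch of the idea behind this fix, under the assumption that credentials arrive percent-encoded inside the proxy URL; `proxyAuthHeader` is an illustrative helper, not the http-client API:

```typescript
function proxyAuthHeader(proxyUrl: string): string | undefined {
  const url = new URL(proxyUrl)
  if (!url.username) return undefined
  // URL keeps userinfo percent-encoded; decode it before building Basic auth.
  const user = decodeURIComponent(url.username)
  const pass = decodeURIComponent(url.password)
  return `Basic ${Buffer.from(`${user}:${pass}`).toString('base64')}`
}

// A password containing '@' must be percent-encoded in the URL itself:
console.log(proxyAuthHeader('http://user:p%40ss@proxy.example.com:8080'))
```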
Brian DeHamer 279e891118
Merge pull request #1790 from actions/bdehamer/attest-headers
support for headers param in attest functions
2024-08-16 07:21:46 -07:00
Brian DeHamer 340a1033a5
support for headers param in attest functions
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-08-15 15:35:32 -07:00
Josh Gross 50f2977cce
Add glob option to ignore hidden files (#1791)
* Add glob option to ignore hidden files

* Use the basename of the file/directory to check for `.`

* Ensure the `excludeHiddenFiles` is properly copied

* Allow the root directory to be matched

* Fix description of `excludeHiddenFiles`

* Document Windows hidden attribute limitation

* Bump version

* `lint`

* Document 0.5.0 release

* Lint again
2024-08-15 17:13:49 -04:00
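An illustrative check in the spirit of the option added here: a path counts as hidden when the basename of the file or directory starts with `.`, while the root directory itself is always allowed to match; as the PR notes, the Windows hidden attribute is a separate mechanism this check does not cover. The helper name is an assumption:

```typescript
import * as path from 'path'

function isHiddenPath(p: string, rootDir: string): boolean {
  if (path.resolve(p) === path.resolve(rootDir)) {
    return false // the root directory itself may always match
  }
  return path.basename(p).startsWith('.')
}

console.log(isHiddenPath('/repo/.git', '/repo')) // true
console.log(isHiddenPath('/repo/src/index.ts', '/repo')) // false
console.log(isHiddenPath('/repo', '/repo')) // false (root allowed)
```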
Thomas Boop 48a65377c0
Fix HTTP client tests (#1792)
* fix tests and update dependencies
2024-08-15 16:53:06 -04:00
Rob Herley f003268b32
Merge pull request #1786 from SMoraisAnsys/fix/chunk-timeout
refactor: set chunk timeout back to 5 minutes
2024-08-06 12:12:38 -04:00
Sébastien Morais 3a33cca851
FIX: Set chunk timeout back to 5 minutes 2024-08-06 10:27:41 +02:00
Rob Herley bb6c500939
Merge pull request #1781 from actions/robherley/artifact-2.1.9
Prep for @actions/artifact v2.1.9
2024-08-01 09:42:30 -04:00
Rob Herley 76b6e24aee
bump pkg lock 2024-07-31 10:12:04 -04:00
Rob Herley 58d14c4ef5
prep for @actions/artifact v2.1.9 2024-07-31 10:05:34 -04:00
Rob Herley 7463cf3da6
Merge pull request #1771 from rmunn/fix-too-many-open-files
Prevent "too many open files" in artifact upload
2024-07-31 09:20:36 -04:00
Brian DeHamer 90d9783552
Merge pull request #1776 from actions/bdehamer/jwks-proxy-fix
fix proxy support for jwks retrieval
2024-07-29 16:31:41 -07:00
Robin Munn 7c61054649 Remove unused import 2024-07-27 17:00:02 +07:00
Brian DeHamer b28406bd1f
fix proxy support for jwks retrieval
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-07-26 15:03:40 -07:00
Robin Munn 9517cdf52d Prevent "too many open files" in artifact upload
See https://www.archiverjs.com/docs/archiver/#file
2024-07-26 08:49:34 +07:00
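Per the archiver docs linked in the commit, `archive.file()` defers opening a path until the archive actually consumes it, whereas appending a `createReadStream` opens the descriptor immediately. A hedged before/after sketch (the `spec` shape is hypothetical):

```typescript
import * as archiver from 'archiver'

const archive = archiver('zip')
const spec = {sourcePath: './file.txt', destinationPath: 'file.txt'}

// Before: every stream is opened up front, so uploads with thousands of
// files can exhaust the process's file-descriptor limit:
// archive.append(fs.createReadStream(spec.sourcePath), {name: spec.destinationPath})

// After: archiver opens each file lazily as it is written into the archive.
archive.file(spec.sourcePath, {name: spec.destinationPath})
```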
Rob Herley 49927e464a
Merge pull request #1774 from actions/robherley/fix-chunk-timeout
Fix artifact upload chunk timeout logic + update tests
2024-07-25 13:52:09 -04:00
Rob Herley 3e34f6d19c
add comment for chunk timeout 2024-07-24 12:40:57 -04:00
Rob Herley 182702d2df
fix chunk timeout + update tests 2024-07-23 21:57:39 -04:00
Rob Herley 1db73622df
Merge pull request #1768 from actions/robherley/artifacts-allow-localhost
Allow localhost hostnames for artifact checks
2024-07-03 14:38:52 -04:00
Rob Herley 56832696fc
npm audit fix 2024-07-03 17:03:40 +00:00
Rob Herley 176b40a888
allow localhost hostnames for artifact checks 2024-07-03 16:55:53 +00:00
Bassem Dghaidi 4902d3a118 Add backend ids 2024-06-24 01:16:11 -07:00
Bassem Dghaidi 04d1a7ec3c Add fix cache paths 2024-06-17 03:36:06 -07:00
Bassem Dghaidi e1b7e78d60 Fix cache misses 2024-06-17 02:39:45 -07:00
Bassem Dghaidi 7640cf17c1 Fix cache misses 2024-06-17 02:35:25 -07:00
Bassem Dghaidi 8d7ed4fb57 Fix cache service url bug 2024-06-17 01:32:41 -07:00
Bassem Dghaidi 5afc042a74 Add download cache v2 2024-06-17 01:17:10 -07:00
Bassem Dghaidi 5e5faf73fc Use zlib for compression 2024-06-13 03:16:59 -07:00
Brian DeHamer 361a115e53
Merge pull request #1759 from actions/bdehamer/rekor-409
config rekor to fetch on conflict
2024-06-12 12:25:06 -07:00
Brian DeHamer dddc440d56
config rekor to fetch on conflict
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-06-12 11:57:18 -07:00
Brian DeHamer 08d6f14ea8
Merge pull request #1745 from actions/bdehamer/attest-provenance
(@actions/attest) New GHA provenance build type
2024-06-12 11:45:37 -07:00
Bassem Dghaidi 9e63a77e7a Implement cache v2 2024-06-10 12:19:52 -07:00
Bassem Dghaidi 146143a9b4 Implement cache v2 2024-06-10 11:55:28 -07:00
Bassem Dghaidi 6635d12ce0 Implement cache v2 2024-06-10 11:36:37 -07:00
Bassem Dghaidi dccc3f7f1c Fix upload mechanics 2024-06-10 11:01:01 -07:00
Bassem Dghaidi 66d5434f23
Add v2 cache upload 2024-06-10 10:56:20 -07:00
Brian DeHamer 73100a7f85
new GHA build provenance
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-06-05 14:54:34 -07:00
Brian DeHamer c6b487124a
Merge pull request #1738 from actions/bdehamer/attest-1.3.0
(@actions/attest) prepare 1.3.0 release
2024-06-05 14:53:11 -07:00
Bassem Dghaidi c8466d1fac Add twirp client 2024-05-29 08:31:54 -07:00
Bassem Dghaidi 264230c2c5 add debug 2024-05-23 09:04:37 -07:00
Bassem Dghaidi 32dbccb77b Add debug message 2024-05-23 07:25:17 -07:00
Brian DeHamer 8735a7e2da
prep 1.3.0 release of @actions/attest
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-05-21 13:11:37 -07:00
Fredrik Skogman d1df13e178
Merge pull request #1735 from kommendorkapten/dynamic-urls
Read the server url from the environment variable.
2024-05-21 07:35:07 +02:00
Fredrik Skogman d3d7736bae
Fixed a spelling error 2024-05-20 07:57:44 +02:00
Fredrik Skogman 7d18e7aa0d
PR feedback. Juse more JS idiomatic code 2024-05-20 07:52:36 +02:00
Fredrik Skogman e60694077d
Read the server url from the environment variable.
Instead of having the urls hardcoded, read them from the environment.
I opted to read from the environment variable instead of the github context
because it would be easier to test.
2024-05-16 17:00:35 +02:00
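A minimal sketch of the approach described in the commit body, assuming the standard `GITHUB_SERVER_URL` Actions variable rather than the PR's exact variable names; an env var is trivial to stub in a test:

```js
// GITHUB_SERVER_URL is a standard Actions variable; the PR's exact
// variable and fallback may differ.
const serverUrl = process.env.GITHUB_SERVER_URL || 'https://github.com'

// In a test: process.env.GITHUB_SERVER_URL = 'https://ghes.example.com'
```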
Brian DeHamer ae38557bb0
Merge pull request #1730 from actions/bdehamer/attest-readme
Update @actions/attest README
2024-05-01 11:48:55 -07:00
Brian DeHamer abb586d71e
add doc link in @actions/attest readme
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-05-01 11:30:45 -07:00
Brian DeHamer 81a73aba8b
Merge pull request #1725 from actions/bdehamer/attest-retry-persist
(@actions/attest) retry request on failure to save attestation
2024-04-24 19:59:43 -07:00
Brian DeHamer 0e8fe8af62
retry request on failure to save attestation
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-04-24 15:07:39 -07:00
Bethany 29885a805e
Merge pull request #1724 from actions/bethanyj28/update-unzip-stream
Use latest `unzip-stream` and `unzip.Extract`
2024-04-24 09:09:09 -04:00
bethanyj28 9eb3d3a673 lint 2024-04-23 16:10:57 -04:00
bethanyj28 6e642f628f lint 2024-04-23 16:06:02 -04:00
bethanyj28 0159bbe7f2 bump version 2024-04-23 16:03:52 -04:00
bethanyj28 476276bf98 use latest unzip-stream 2024-04-23 15:54:54 -04:00
Brian DeHamer d82fd09f99
Merge pull request #1714 from actions/bdehamer/attest-no-make-fetch-happen
(@actions/attest) remove dep on make-fetch-happen
2024-04-23 10:39:57 -07:00
Brian DeHamer 2961d73391
remove dep on make-fetch-happen
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-04-23 09:39:17 -07:00
Rob Herley eb1cb3649c
Merge pull request #1721 from actions/robherley/retry-502-invalid-body
artifact client: retry on non-JSON response
2024-04-19 14:02:46 -04:00
Rob Herley b384fe17ba
bump pkg version + release notes 2024-04-19 15:08:30 +00:00
Rob Herley ccb1df45d1
artifact client: retry on non-JSON response 2024-04-19 14:03:47 +00:00
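The failure mode behind this fix is a proxy answering 502 with an HTML body, which breaks `JSON.parse` even though the request is retryable. A rough sketch of the behavior, not the toolkit's actual client code (`sendRequest` is any function returning the raw response body):

```js
async function requestJsonWithRetry(sendRequest, attempts = 5) {
  for (let attempt = 1; attempt <= attempts; attempt++) {
    const body = await sendRequest()
    try {
      return JSON.parse(body)
    } catch {
      // Non-JSON body (e.g. an HTML 502 page): retry with linear backoff.
      if (attempt === attempts) {
        throw new Error(`Response was not JSON after ${attempts} attempts`)
      }
      await new Promise(resolve => setTimeout(resolve, attempt * 1000))
    }
  }
}
```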
eggyhead 5a736647a1
Merge pull request #1712 from actions/vmjoseph/update-archiver
Upgrading `upload-artifact` and `download-artifact` archiver package
2024-04-15 13:03:10 -07:00
Vallie Joseph 918b468a41 replacing writeFile with writeFileSync 2024-04-15 16:57:28 +00:00
Vallie Joseph 234761dc05 replacing writeFile with writeFileSync 2024-04-15 16:51:30 +00:00
Vallie Joseph fa1cb5d153 correcting imports 2024-04-15 16:49:47 +00:00
Vallie Joseph e998cf1216 cleaning up tests 2024-04-15 16:32:31 +00:00
Vallie Joseph 2bbbf928ae re-adding minor ver for now 2024-04-15 16:20:24 +00:00
Vallie Joseph fa06a1eadf removing minor ver for now 2024-04-15 16:18:41 +00:00
Vallie Joseph 5eea9e34e7 cleaning up comments and removing clear timeout outside of finaly 2024-04-15 16:08:45 +00:00
Vallie Joseph 75b5e5376d updating artifact version 2024-04-15 15:32:08 +00:00
Vallie Joseph be507421b1 . 2024-04-15 15:24:57 +00:00
Vallie Joseph 5d943d4b7f Rever http 2024-04-15 12:59:58 +00:00
Vallie Joseph 67951b1f2b Merge branch 'main' into vmjoseph/update-archiver 2024-04-15 12:18:10 +00:00
eggyhead c104cf5dc0
Merge pull request #1713 from actions/eggyhead/fix-tar-ddos-vuln
fixing https://github.com/advisories/GHSA-f5x3-32g6-xq36
2024-04-12 13:41:10 -07:00
Vallie Joseph 4fb4c6ed94 Merge branch 'eggyhead/fix-tar-ddos-vuln' into vmjoseph/update-archiver 2024-04-12 20:31:55 +00:00
eggyhead df5a794b3d fixing new-package script instruction 2024-04-10 21:48:57 +00:00
eggyhead c01bc907ed fixing https://github.com/advisories/GHSA-f5x3-32g6-xq36 2024-04-10 21:30:24 +00:00
Vallie Joseph 222733049e . 2024-04-09 21:22:40 +00:00
Vallie Joseph fa9db3c8fa wrapping timeout in try catch 2024-04-09 21:18:30 +00:00
Vallie Joseph 18a8a22c65 updating upload try catch to always call cleartimeout 2024-04-09 21:05:58 +00:00
Vallie Joseph 425f05e29d moving timer outside of uploadZipToBlobStorage 2024-04-09 21:04:29 +00:00
Vallie Joseph 90fca23920 replacing timeout 2024-04-09 20:51:12 +00:00
Vallie Joseph 0d3d3bbb40 Adding missing progress time 2024-04-09 20:40:08 +00:00
Vallie Joseph 98ce947a6c updating timeout 2024-04-09 19:38:57 +00:00
Vallie Joseph 2ed9516172 updating timeout 2024-04-09 19:24:52 +00:00
Vallie Joseph 4fc93ec115 . 2024-04-09 19:01:54 +00:00
Vallie Joseph 61d6acdeb1 updating test 2024-04-09 18:52:19 +00:00
Vallie Joseph f98ccd1e39 updating tests 2024-04-09 18:21:41 +00:00
Vallie Joseph 7f0a981b2e Revert http 2024-04-09 18:09:34 +00:00
Vallie Joseph 2e7a11c409 upgrading archiver package along with chunk timeout 2024-04-09 18:02:48 +00:00
Brian DeHamer 9ddf153e00
Merge pull request #1701 from actions/bdehamer/attest-v03-bundle
(@actions/attest) generate attestations using v0.3 bundle format
2024-04-03 13:51:26 -07:00
Brian DeHamer f8d95a85df
generate v0.3 bundles in attest package
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-04-03 12:12:26 -07:00
Brian DeHamer 59e9d284e9
Merge pull request #1693 from actions/bdehamer/oidc-provenance
(@actions/attest) build provenance statement from OIDC claims
2024-03-28 13:44:22 -07:00
Brian DeHamer 4ce4c767e2
npm audit fix
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-03-22 12:44:24 -07:00
Brian DeHamer a0e6af1e53
build provenance stmt from OIDC claims
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-03-22 12:34:42 -07:00
Bethany ef77c9d60b
Merge pull request #1683 from Smeb/fix-1579
fix #1579: add test to check getCacheVersion does not mutate arguments
2024-03-07 10:48:45 -05:00
Smeb 8fee77b04b fix #1579: add test to check getCacheVersion does not mutate arguments 2024-03-07 16:23:04 +01:00
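A hedged sketch of the test idea, with `getCacheVersion` assumed to be imported from the cache package's internals:

```js
test('getCacheVersion does not mutate its paths argument', () => {
  const paths = ['node_modules', 'dist']
  const snapshot = [...paths]
  getCacheVersion(paths) // assumed in scope for the sketch
  // The caller's array must be byte-for-byte what it was before the call.
  expect(paths).toEqual(snapshot)
})
```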
Luke Tomlinson b807fc9c54
Update http-client to 2.2.1 (#1679) 2024-03-01 15:09:37 -05:00
Bethany 55c7a1e03d
Merge pull request #1678 from actions/bethanyj28/logging
Add info level logging for zip extract
2024-03-01 13:09:41 -05:00
bethanyj28 4799020e28 bump version 2024-03-01 13:04:16 -05:00
bethanyj28 bb420e4681 add info level logging for zip extract 2024-03-01 12:54:40 -05:00
Bethany 0c735ba79d
Merge pull request #1677 from actions/bethanyj28/update-releases
Flip releases update order
2024-02-29 12:01:04 -05:00
Bethany e918bf24ae
Update RELEASES.md 2024-02-29 10:41:57 -05:00
Bethany eea6b7f517
Update RELEASES.md 2024-02-29 10:40:22 -05:00
teatimeguest ff435e591d
Make sure RequestOptions.keepAlive is applied properly on node20 runtime (#1572) 2024-02-28 12:10:57 -05:00
Bethany df3315bbea
Merge pull request #1676 from actions/bethanyj28/flip-releases
Flip releases order
2024-02-28 10:46:45 -05:00
Bethany b7770574c2
flip releases order 2024-02-28 10:35:01 -05:00
Brian DeHamer 29bf378d97
Merge pull request #1675 from actions/provenance-permissions
fix permissions for release workflow
2024-02-26 11:40:12 -08:00
Brian DeHamer 68b042febd
fix permissions for release workflow
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-02-26 11:32:45 -08:00
Brian DeHamer c366a07d62
Merge pull request #1672 from actions/attest-v1.0.0
bump @actions/attest to 1.0.0
2024-02-26 11:13:48 -08:00
Brian DeHamer 9e5eb95517
Merge pull request #1674 from actions/npm-provenance
publish npm packages with build provenance
2024-02-26 11:13:32 -08:00
Brian DeHamer 7f96bd610d
publish npm packages with build provenance
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-02-26 10:42:33 -08:00
Thomas Boop 8f53a1d37f
Update CODEOWNERS (#1673) 2024-02-26 13:31:23 -05:00
Brian DeHamer 37a562b194
bump @actions/attest to 1.0.0
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-02-26 10:21:47 -08:00
Brian DeHamer ad1f156c7c
Merge pull request #1667 from actions/bdehamer/attest
add new @actions/attest package
2024-02-26 10:15:14 -08:00
Brian DeHamer 6079dea4c4
add new @actions/attest package
Signed-off-by: Brian DeHamer <bdehamer@github.com>
2024-02-26 08:52:20 -08:00
Bethany 437f2be56d
Merge pull request #1671 from actions/bethanyj28/update-version
Update artifacts to 2.1.3
2024-02-26 10:24:29 -05:00
bethanyj28 97c606b612 update to 2.1.3 2024-02-26 10:18:02 -05:00
Bethany 5a7faf0eb5
Merge pull request #1670 from actions/bethanyj28/fix-callback
Ensure callback is only called once
2024-02-26 10:04:37 -05:00
bethanyj28 dcc55dfd04 feedback 2024-02-26 09:56:00 -05:00
bethanyj28 902046e4d8 ensure callback is only called once 2024-02-26 09:36:35 -05:00
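The standard guard for this class of bug is a once-wrapper, sketched here generically (not the toolkit's exact code):

```js
function once(fn) {
  let called = false
  return (...args) => {
    if (called) return // swallow any duplicate invocation
    called = true
    return fn(...args)
  }
}

// e.g. when multiple stream events can race:
// const done = once(resolve)
// stream.on('end', done)
// stream.on('close', done)
```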
Bethany 88f7a7bc65
Merge pull request #1666 from actions/bethanyj28/download-path
Use `unzip.Parse` over `unzip.Extract`
2024-02-23 16:22:24 -05:00
bethanyj28 6cf4fbcef8 add a comment 2024-02-23 15:33:24 -05:00
bethanyj28 7fa864a4f4 go back to normalize) 2024-02-23 15:28:25 -05:00
Bethany f77cbc9ef7
Update packages/artifact/src/internal/download/download-artifact.ts
Co-authored-by: Tingluo Huang <tingluohuang@github.com>
2024-02-23 15:20:01 -05:00
bethanyj28 8a1800c5da use resolve instead of normalize 2024-02-23 15:15:17 -05:00
bethanyj28 90894a8853 bump version 2024-02-23 15:03:09 -05:00
bethanyj28 614f27a4fb use stream transform 2024-02-23 14:34:39 -05:00
bethanyj28 ac84a9bee3 re-add noop logs and format + lint 2024-02-23 13:46:22 -05:00
bethanyj28 4256ea99c5 update test case and handling 2024-02-23 13:41:40 -05:00
bethanyj28 76489f433b attempt with comparing index 2024-02-23 11:59:36 -05:00
bethanyj28 e9005f7727 ensure no path traversal 2024-02-23 10:54:12 -05:00
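The usual shape of such a check: resolve each zip entry against the destination directory and reject anything that escapes it. An illustrative sketch, not the exact toolkit code:

```js
const path = require('path')

function resolveEntrySafely(destDir, entryName) {
  const destRoot = path.resolve(destDir)
  const resolved = path.resolve(destRoot, entryName)
  // Entries like '../../etc/passwd' resolve outside destRoot and are blocked.
  if (resolved !== destRoot && !resolved.startsWith(destRoot + path.sep)) {
    throw new Error(`Blocking path traversal in zip entry: ${entryName}`)
  }
  return resolved
}
```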
bethanyj28 8d03fb4787 prettier 2024-02-23 08:46:56 -05:00
bethanyj28 d3301c9bc2 update path parsing 2024-02-23 08:42:23 -05:00
bethanyj28 1e326de474 use existing function 2024-02-23 08:28:37 -05:00
bethanyj28 83731e6528 remove awaits from on entry 2024-02-22 22:06:32 -05:00
bethanyj28 a24b9c0184 handle directories 2024-02-22 21:54:54 -05:00
bethanyj28 31c555afda prettier 2024-02-22 20:31:49 -05:00
bethanyj28 9dea373bba wait for upload to finish 2024-02-22 20:29:42 -05:00
bethanyj28 b956d8a4dd audit, lint, format 2024-02-22 17:55:53 -05:00
bethanyj28 81d5e48db0 update tests 2024-02-22 17:51:15 -05:00
bethanyj28 bc5b3a85ae use on entry 2024-02-22 17:16:32 -05:00
Konrad Pabjan 415c42d27c
Update workflows to use v4 actions (#1652)
* Update releases.yml to use v4 actions

* Bump all workflows
2024-02-01 12:50:47 -05:00
eggyhead e6c1cd0d8c
Merge pull request #1651 from actions/eggyhead/update-ghescheck-cache-v3.2.4
updating cache version and release to include ghes check change
2024-02-01 09:21:58 -08:00
eggyhead 39621898ff
Merge pull request #1650 from actions/eggyhead/update-ghescheck-artifacts-v2.1.1
updating artifact version and release to include ghes check change
2024-02-01 08:43:20 -08:00
eggyhead c500de6dea updating cache version and release to include ghes check change
Revert "updating cache version and release to include ghes check change"

This reverts commit 7185d8964514361b7b8dcdba1f9dd54ef24b8bdd.

updating cache version and release to include ghes check change
2024-01-31 21:23:20 +00:00
eggyhead c4f4f5ae07 updating artifact version and release to include ghes check change 2024-01-31 21:15:11 +00:00
eggyhead f1d9b4b985
Merge pull request #1648 from actions/eggyhead/ghescheck-updatehosts
Update GHES host check
2024-01-31 10:33:31 -08:00
eggyhead d134334a38 lint fixes 2024-01-31 16:51:04 +00:00
eggyhead 3b02a6fdc5 updating alowed hosts in isGhes check
updating alowed hosts in artifact ghes check

using dot prepend ghe host
2024-01-31 16:30:37 +00:00
eggyhead 1fe633e27c
Merge pull request #1627 from actions/eggyhead/hyperlinks-faq
adding hyperlinks for new section of artifacts faq
2024-01-19 08:40:40 -08:00
eggyhead 74bca717aa
Update packages/artifact/docs/faq.md
Consistent spacing in version table

Co-authored-by: Bethany <bethanyj28@users.noreply.github.com>
2024-01-19 08:37:38 -08:00
eggyhead bb4505e078 yaml formatting 2024-01-18 17:36:26 +00:00
eggyhead dbfca0275d removing numbered list 2024-01-18 17:35:08 +00:00
eggyhead d01372220d bold text 2024-01-18 17:33:39 +00:00
eggyhead 8e13afa0db updating language and adding compatibility table 2024-01-18 17:32:19 +00:00
Rob Herley 4e3b068ce1
Merge pull request #1629 from actions/robherley/update-docs-2.1.0
v2.1.0 Generate docs + update release notes
2024-01-18 11:25:50 -05:00
Rob Herley 017d757dd4
update releases.md 2024-01-18 11:07:25 -05:00
eggyhead 5212cb5ed9
Merge pull request #1628 from actions/eggyhead/update-getartifact-errmessage
updating artifact not found error message
2024-01-18 08:02:52 -08:00
eggyhead cca96584eb removing newline and camelcasing GitHub 2024-01-18 15:57:21 +00:00
Rob Herley 58c2878fce
generate docs + update releases 2024-01-18 09:51:01 -05:00
Rob Herley daf23ba955
Merge pull request #1626 from actions/robherley/delete-artifacts
Add methods to delete artifacts
2024-01-18 09:46:52 -05:00
eggyhead 5016db01fe update message for internal method 2024-01-18 04:14:39 +00:00
eggyhead 30942cc4ae updating artifact not found error message to include more information and link to FAQ 2024-01-18 04:10:35 +00:00
eggyhead 98f72c3040 adding hyperlinks for new section of artifacts faq 2024-01-18 04:03:48 +00:00
eggyhead 64c0992283 adding version compatibility and retention to artifacts FAQ 2024-01-18 03:58:06 +00:00
Rob Herley 1852eb2115
more delete examples 2024-01-17 18:58:58 -05:00
Rob Herley abe0bd98df
delete example 2024-01-17 18:21:25 -05:00
Rob Herley 2ad687a32e
add integration test for delete 2024-01-17 17:54:10 -05:00
Rob Herley 2f5fb3f92b
list for correct backend ids in internal delete 2024-01-17 17:53:49 -05:00
Rob Herley 7fd71a5e13
fix typo 2024-01-17 16:56:34 -05:00
Rob Herley b62d4c91b6
add public and internal methods to delete artifacts 2024-01-17 16:18:49 -05:00
Rob Herley 1b5a6e26f4
Merge pull request #1623 from actions/robherley/update-cache-release
Updates release notes for @actions/cache v3.2.3
2024-01-10 17:40:55 -05:00
Rob Herley 7c27528ab4
Update RELEASES.md
Updates release notes for @actions/cache v3.2.3
2024-01-10 17:32:52 -05:00
Rob Herley 82e8bc69b8
Merge pull request #1622 from actions/robherley/bump-cache-version
Update cache npm package version
2024-01-10 17:29:16 -05:00
Rob Herley b9079670eb
Update cache npm package version 2024-01-10 17:05:13 -05:00
Rob Herley cab491a426
Merge pull request #1378 from MSP-Greg/00-cache-paths-dup
cache - getCacheVersion - dup paths array
2024-01-10 17:01:51 -05:00
Vallie Joseph 0389dcd5e4
updating release notes (#1620) 2024-01-10 10:43:38 -05:00
Ryan Troost 64b2775394
Merge pull request #1613 from actions/srryan/download-v4-client-blob
Update `http.client` to retry transient network hang ups
2024-01-09 16:01:39 -05:00
Vallie Joseph 439cd9b37e appeasing linter 2024-01-09 19:47:25 +00:00
Vallie Joseph c1ded1dc4d appeasing linter 2024-01-09 19:47:02 +00:00
Vallie Joseph f37c445bc5 reverting jest 2024-01-09 19:46:17 +00:00
Vallie Joseph e95bcfe359
Update jest.config.js 2024-01-09 14:44:29 -05:00
Vallie Joseph 7549d1b218 removing info logs 2024-01-09 19:42:04 +00:00
Vallie Joseph 2124ef2413 cleaning up logs 2024-01-09 19:36:26 +00:00
Vallie Joseph d617670abc updating timer; removing logs 2024-01-09 19:23:57 +00:00
Vallie Joseph 47157e5ade fixing true 2024-01-09 19:05:11 +00:00
Vallie Joseph 8a6aae0a16 updating global timeout 2024-01-09 19:03:41 +00:00
Vallie Joseph 58ec2bdcc9 increase timeout 2024-01-09 18:55:50 +00:00
Vallie Joseph e19b629130 increasing timeout 2024-01-09 18:45:26 +00:00
Vallie Joseph d63a8c4d3f updating package-json 2024-01-09 17:13:35 +00:00
Vallie Joseph 67d2d582dc adding delayed response to message body http mock 2024-01-09 16:44:12 +00:00
Vallie Joseph 9d70b8a9fb testing reject after timeout 2024-01-08 15:20:05 +00:00
Vallie Joseph 7f47ffaee2 committing v1 2023-12-22 03:51:47 +00:00
Vallie Joseph 98e1a813db testing ci 2023-12-21 20:22:20 +00:00
Vallie Joseph 0d39975814 updating test with blob timeouts 2023-12-21 18:31:01 +00:00
Vallie Joseph f482643a6e updating timeout for retries 2023-12-21 15:10:01 +00:00
bethanyj28 ff2c524611 lint and format 2023-12-21 09:25:34 -05:00
srryan ecb4df89bf remove the exit 2023-12-20 18:23:47 -05:00
srryan 03319fcffa client fixes for retries + logging 2023-12-20 18:08:00 -05:00
srryan c33724abbd update to http client 2023-12-20 15:45:19 -05:00
Rob Herley d6f3ee93b8
reject don't throw 2023-12-20 14:37:13 -05:00
Rob Herley 34a411f3c0
add timeout in between data chunks 2023-12-20 13:59:31 -05:00
Rob Herley 2d6ba67518
retry the promise 2023-12-20 13:11:04 -05:00
Yukai Chou 5430c5d848
fix typo (#1611) 2023-12-20 03:16:52 -05:00
James Renaud bc68ce94ea
chore(docs): add missing job summary documentation (#1574)
Co-authored-by: Konrad Pabjan <konradpabjan@github.com>
2023-12-20 03:12:17 -05:00
srryan 78ed49ff88 update error handling abort 2023-12-19 12:46:58 -05:00
srryan c119fcd773 update optional settings for blob client 2023-12-19 12:02:10 -05:00
srryan 73babeabef add explicit options 2023-12-19 11:49:39 -05:00
Vallie Joseph bf93b54558 adding logger for blob client and response 2023-12-18 23:09:10 +00:00
srryan 0c0770ce57 cleanup 2023-12-18 17:52:55 -05:00
srryan 571bf222ee update to use blob client over http client 2023-12-18 17:11:14 -05:00
Rob Herley 68f22927e7
Merge pull request #1608 from actions/robherley/artifact-client-import
Update artifact module quick start
2023-12-14 15:46:14 -05:00
Rob Herley 11a2dd3117
update artifact module quick start 2023-12-14 15:38:49 -05:00
Rob Herley 43c63eef65
Merge pull request #1607 from actions/robherley/update-artifact-tests
Update artifact workflow tests
2023-12-13 12:47:12 -05:00
Rob Herley 6a9034d692
update artifact workflow tests 2023-12-13 12:19:14 -05:00
Rob Herley eff198be5b
Merge pull request #1605 from actions/robherley/usage-message
Better error message for artifact usage limits
2023-12-12 09:49:55 -05:00
Rob Herley 16b786a545
better error message for usage limits 2023-12-11 22:01:08 -05:00
Rob Herley 18ce228b82
Merge pull request #1603 from actions/robherley/network-errors
Add specific messages for network-specific node error codes
2023-12-11 17:34:24 -05:00
Rob Herley a4bd0f1214
Add specific messages for network-specific node error codes 2023-12-11 17:07:48 -05:00
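Illustrative only: the codes below are real Node network error codes, but the messages are invented for this sketch rather than the wording the PR shipped.

```js
const NETWORK_ERROR_HINTS = {
  ECONNRESET: 'Connection reset by the remote host.',
  ENOTFOUND: 'DNS lookup failed; check the service hostname.',
  ETIMEDOUT: 'Connection attempt timed out.',
  ECONNREFUSED: 'The remote host refused the connection.'
}

function describeNetworkError(err) {
  const hint = NETWORK_ERROR_HINTS[err.code]
  return hint ? `${err.code}: ${hint}` : err.message
}
```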
Rob Herley 37a66ebd47
Merge pull request #1602 from actions/robherley/replace-unzip-lib
[artifact] replace unzipper with unzip-stream
2023-12-11 14:22:07 -05:00
Rob Herley 09249a72d7
push null at end of mocked message 2023-12-11 13:41:11 -05:00
Rob Herley 4c531c013a
update packages 2023-12-11 12:24:41 -05:00
Rob Herley 3c3af56b29
replace unzipper with unzip-stream 2023-12-11 12:15:40 -05:00
Vallie Joseph 950e1711a1
Improve error messages (duplicate artifacts; too many artifacts) (#1600)
* cleaning up error messages

* updating package-json

* updating package-lock

* .

* .

* testing return message

* updating error check

* adding test

* rmv unused var

* updating status code to match conflict message
2023-12-11 11:26:54 -05:00
Jonathan Tamsut 88b76de595
Add back 429 to list of retryable requests (#1599)
* add back 429 to list of retryable requests

* fix lint error
2023-12-08 11:00:44 -08:00
Jonathan Tamsut 55a05255d7
Remove 429 request from list of retry-able status codes (#1597)
* remove 429 request from retryable

* remove 413

* make linter happy
2023-12-07 13:22:17 -08:00
Rob Herley 64d1b104d0
Generate Typescript Docs for `@actions/artifact` (#1595)
* autogenerate artifact documentation

* clean up comments for better autogen docs
2023-12-07 09:57:20 -08:00
Rob Herley 43ccaf05d9
Merge pull request #1596 from actions/robherley/cleanup-handlers
Cleanup artifact handlers hanging node process
2023-12-06 19:27:30 -05:00
Rob Herley f732e4cd62
linter 2023-12-06 23:57:33 +00:00
Rob Herley 8c317a0e59
one too many parses 2023-12-06 23:51:16 +00:00
Rob Herley 715b1acc05
cleanup artifact handlers hanging node process 2023-12-06 23:42:07 +00:00
Rob Herley 207747e7af
Merge pull request #1594 from actions/robherley/artifact-docs-updates
@actions/artifact doc updates
2023-12-06 14:30:00 -05:00
Rob Herley c042a30d3d
Update packages/artifact/CONTRIBUTIONS.md
Co-authored-by: Mattia Richetto <mattiaerre@github.com>
2023-12-06 14:05:38 -05:00
Rob Herley 70cad3f635
Update packages/artifact/README.md
Co-authored-by: Konrad Pabjan <konradpabjan@github.com>
2023-12-06 13:19:38 -05:00
Rob Herley 1f87038676
Update packages/artifact/README.md
Co-authored-by: Konrad Pabjan <konradpabjan@github.com>
2023-12-06 13:19:32 -05:00
Rob Herley 8cd4434523
mention job limit 2023-12-06 17:30:13 +00:00
Rob Herley 2e6c9a1f14
pr feedback 2023-12-06 17:28:03 +00:00
Rob Herley c08a7d1b2e
Update packages/artifact/README.md
Co-authored-by: Konrad Pabjan <konradpabjan@github.com>
2023-12-06 12:19:49 -05:00
Rob Herley 49ef8b93a8
fix typo 2023-12-06 15:38:59 +00:00
Rob Herley 19d4d9d3b2
releases.md: link to breaking v2 changes 2023-12-06 14:52:49 +00:00
Rob Herley b43b97985c
Update packages/artifact/docs/faq.md
Co-authored-by: Bethany <bethanyj28@users.noreply.github.com>
2023-12-06 09:31:55 -05:00
Rob Herley 23fb8c4782
Update packages/artifact/README.md
Co-authored-by: Bethany <bethanyj28@users.noreply.github.com>
2023-12-06 09:31:09 -05:00
Rob Herley dc515188a8
Update packages/artifact/README.md
Co-authored-by: Bethany <bethanyj28@users.noreply.github.com>
2023-12-06 09:30:53 -05:00
Rob Herley 79ace256d6
Update packages/artifact/README.md
Co-authored-by: Bethany <bethanyj28@users.noreply.github.com>
2023-12-06 09:30:35 -05:00
Rob Herley 68958c2486
Update packages/artifact/README.md
Co-authored-by: Bethany <bethanyj28@users.noreply.github.com>
2023-12-06 09:30:20 -05:00
Rob Herley 0c9621922e
add faq, update releases 2023-12-06 04:22:18 +00:00
Rob Herley 9b31b03496
more readme updates 2023-12-06 04:10:46 +00:00
Rob Herley befa19f3a8
initalize artifact client as default export 2023-12-06 04:00:07 +00:00
Rob Herley e27efe5620
readme & error updates 2023-12-05 21:55:22 +00:00
Rob Herley 449b28aee2
update contributing docs 2023-12-05 21:10:48 +00:00
Rob Herley 04945c6048
Merge pull request #1593 from actions/robherley/api-consistency
Consistent error behavior for Artifact methods
2023-12-05 15:22:16 -05:00
Rob Herley 5f152b798e
Update artifact-tests.yml 2023-12-05 13:54:14 -05:00
Rob Herley c390199be6
Update artifact-tests.yml 2023-12-05 13:51:51 -05:00
Rob Herley a3053b5cc2
fix typo 2023-12-05 18:47:37 +00:00
Rob Herley b9872153b8
update GHES warning behavior 2023-12-05 18:42:36 +00:00
Rob Herley ce9eae0785
consistent promise behavior for download artifact 2023-12-05 18:35:26 +00:00
Rob Herley d3c5f358d1
consistent promise behavior for get artifact 2023-12-05 17:56:18 +00:00
Rob Herley 75a3586061
consistent promise behavior for upload artifact 2023-12-05 17:35:46 +00:00
Rob Herley 8ac8bf1d3d
Merge pull request #1592 from actions/robherley/get-list-artifact-updates
Additional get/list artifact updates
2023-12-04 12:40:59 -05:00
Rob Herley 141b3509e4
update import 2023-12-03 21:13:55 +00:00
Rob Herley 790e6f7194
more docs 2023-12-03 20:52:36 +00:00
Rob Herley ef454f0991
add tests for list-artifacts 2023-12-03 20:48:33 +00:00
Rob Herley 86ce0b159a
get artifact tests 2023-12-03 19:43:37 +00:00
Rob Herley c11a7cdeac
wip 2023-12-03 06:24:49 +00:00
Rob Herley c94ca49c9c
ability to filter artifacts by latest 2023-12-03 05:01:20 +00:00
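The `latest` option also appears in the workflow test later in this diff; in client code it looks roughly like this (client instantiation assumed):

```js
// Keep only the most recent artifact per name, useful when re-runs
// produce multiple artifacts with the same name.
const {artifacts} = await artifactClient.listArtifacts({latest: true})
```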
Rob Herley fa7657714a
fix import 2023-12-02 21:34:07 -05:00
Rob Herley c1f9d37323
updates to get/list artifacts 2023-12-02 21:18:22 -05:00
Rob Herley 8f1c589e25
Merge pull request #1591 from actions/robherley/artifact-internal-apis
Implement internal APIs for list/get/download artifacts
2023-12-01 16:17:26 -05:00
Rob Herley 281697ecbe
fix test expectations 2023-12-01 16:34:27 +00:00
Rob Herley a59f976dd4
minor fixes 2023-12-01 09:05:46 -05:00
Rob Herley 57db7a6302
more debug info 2023-12-01 03:04:10 +00:00
Rob Herley 4789a46578
make FindOptions interface more user friendly 2023-12-01 02:15:25 +00:00
Rob Herley 32549e8197
update download-artifact tests for public and internal impl 2023-12-01 01:32:45 +00:00
Rob Herley 22b7aeb707
some test updates 2023-12-01 00:31:27 +00:00
Rob Herley e9d6649a14
consume new pb wrappers 2023-11-30 19:10:07 +00:00
Rob Herley 695bf98f84
rewrite artifacts client to have public and internal implementations 2023-11-30 03:47:04 +00:00
Tingluo Huang 0787a93181
Merge pull request #1588 from sshmaxime/main
Add RUN_ATTEMPT to `@actions/github` Context class
2023-11-28 10:43:43 -05:00
Maxime Aubanel faa425440f
Add RUN_ATTEMPT to Github context 2023-11-28 16:32:10 +01:00
Rob Herley 0407266511
Merge pull request #1584 from actions/robherley/upload-v4-improvements
Increase Artifact v4 upload speed
2023-11-20 16:30:50 -05:00
Rob Herley a920781ca9
fix results url construction 2023-11-20 18:06:44 +00:00
Rob Herley 9e7201ff5b
audit fix 2023-11-20 16:51:13 +00:00
Rob Herley 3a610e848c
linter 2023-11-20 16:46:08 +00:00
Rob Herley 606ebdcf6d
extra log line for debug 2023-11-20 16:27:35 +00:00
Rob Herley 7b01731091
increase upload concurrency based on cpus, adjust highWaterMark, specify compression level 2023-11-20 15:03:58 +00:00
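A rough sketch of deriving upload parallelism from the host; the toolkit's real constants and clamping may differ:

```js
const os = require('os')

// More parallel block uploads on larger runners, within sane bounds.
const concurrency = Math.min(32, Math.max(4, os.cpus().length * 2))
// Larger stream buffer keeps the upload pipeline fed between flushes.
const uploadOptions = {concurrency, highWaterMark: 8 * 1024 * 1024}
```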
Nikolai Laevskii 20f826bfe7
Add platform info utilities to @actions/core (#1551)
* Introduce platform utilities into @actions/core

* Add tests for the platform helper

* Update README.md

* Update README.md with more details
2023-11-14 14:15:26 -05:00
Rob Herley fe3e7ce9a7
Merge pull request #1563 from actions/robherley/artifact-v4/sha256
Use sha256 instead of md5 for artifact v4 integrity hash
2023-10-16 13:31:00 -04:00
Rob Herley 8cd02dfabc
audit fix 2023-10-16 16:27:26 +00:00
Rob Herley 82474125c8
use sha256 instead of md5 for artifact v4 integrity hash 2023-10-16 16:20:24 +00:00
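The integrity hash itself is a standard streaming digest; only the algorithm name changed. A minimal sketch:

```js
const crypto = require('crypto')
const fs = require('fs')

function sha256OfFile(filePath) {
  return new Promise((resolve, reject) => {
    const hash = crypto.createHash('sha256') // was 'md5' before this change
    fs.createReadStream(filePath)
      .on('error', reject)
      .on('data', chunk => hash.update(chunk))
      .on('end', () => resolve(hash.digest('hex')))
  })
}
```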
Tatyana Kostromskaya 494f12bcd9
Update dependencies in github package (#1553)
* Update octokit package

* define type for function

* fix linter

* Update github package to latest

* Update RELEASES.md
2023-10-10 16:04:42 +02:00
Tatyana Kostromskaya 797f48fcfa
Update release notes for http-client@2.2.0 (#1549) 2023-10-06 16:03:00 +02:00
Tatyana Kostromskaya c8d1588732
Merge pull request #1547 from actions/takost/update-http-client
Add function to return proxy agent dispatcher for compatibility with latest `octokit` packages
2023-10-06 14:47:16 +02:00
Tatyana Kostromskaya 13e0ce9cf7 resolve comments 2023-10-06 12:39:29 +00:00
Tatyana Kostromskaya eae1b66cb0 fix audit 2023-10-05 16:41:02 +02:00
Tatyana Kostromskaya 129f884271 fix format 2023-10-05 16:34:31 +02:00
Tatyana Kostromskaya 0faced6a0b Add function to return proxy agent dispatcher for compatibility with latest octokit 2023-10-05 16:20:26 +02:00
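Newer Octokit versions route requests through an undici dispatcher, so proxy support needs a `ProxyAgent` rather than a classic `http.Agent`. A hedged sketch of the wiring (the helper name on `HttpClient` is assumed here):

```js
const {HttpClient} = require('@actions/http-client')

const client = new HttpClient()
// Returns an undici proxy dispatcher when a proxy is configured for this
// URL (method name assumed for the sketch).
const dispatcher = client.getAgentDispatcher('https://api.github.com')
// Pass `dispatcher` via Octokit's request options when behind a proxy.
```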
Patrick Ellis 0d63834474
Merge pull request #1541 from actions/pje/upgrade-codeql-actions-to-v2
Upgrade codeql actions to v2
2023-09-27 16:14:48 -04:00
Patrick Ellis 8f032d304a
Upgrade codeql actions to v2
Currently we're using v1, and there have been some important changes since then.

In particular, the latest version, v2.14.6, contains an important security patch:

> The CodeQL CLI no longer supports the `SEMMLE_JAVA_ARGS` environment variable. All previous versions of the CodeQL CLI perform command substitution on the `SEMMLE_JAVA_ARGS` value (for example, replacing `'$(echo foo)'` with `'foo'`) when starting a new Java virtual machine, which, depending on the execution environment, may have security implications. Users are advised to check their environments for possible `SEMMLE_JAVA_ARGS` misuse.

See the [codeql-cli-binaries release notes](https://github.com/github/codeql-cli-binaries/releases/tag/v2.14.4) for full details.
2023-09-27 15:18:59 -04:00
Tatyana Kostromskaya 28b09e224f
Merge pull request #1526 from actions/takost/upd-dependencies
Update dependencies to latest
2023-09-27 12:37:10 +02:00
Tatyana Kostromskaya 111c95866e fix test + update semver 2023-09-26 11:10:18 +00:00
Tatyana Kostromskaya ddc9c52eb6 revert octokit changes 2023-09-26 11:05:37 +00:00
Tatyana Kostromskaya 6d37c6eb2b try to fix tests 2023-09-15 15:04:21 +00:00
Tatyana Kostromskaya 6477ef1460 tests 2023-09-15 13:54:28 +00:00
Tatyana Kostromskaya 2e5b10e3bd fix tests 2023-09-15 13:45:26 +00:00
Tatyana Kostromskaya 8c1e6a00f0 try to fix test 2023-09-15 13:28:29 +00:00
Tatyana Kostromskaya b2d5fa216f update github package 2023-09-14 14:32:08 +00:00
Luke Tomlinson c5c786523e
@actions/core v1.10.1 (#1529) 2023-09-11 10:45:23 -04:00
Tatyana Kostromskaya ce31408ff5 Update dependencies 2023-09-08 14:29:27 +00:00
MSP-Greg 0747ab3577
cache - getCacheVersion - dup paths array 2023-03-20 18:29:46 -05:00
183 changed files with 26380 additions and 8273 deletions

View File

@@ -43,7 +43,7 @@ Note that before a PR will be accepted, you must ensure:
1. In a new branch, create a new Lerna package:
```console
$ npm run create-package new-package
$ npm run new-package [name]
```
This will ask you some questions about the new package. Start with `0.0.0` as the first version (look generally at some of the other packages for how the package.json is structured).

.github/dependabot.yml (new file, 27 lines)
View File

@@ -0,0 +1,27 @@
# To get started with Dependabot version updates, you'll need to specify which
# package ecosystems to update and where the package manifests are located.
# Please see the documentation for all configuration options:
# https://docs.github.com/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file
version: 2
updates:
- package-ecosystem: "npm"
directory: "/packages/artifact"
schedule:
interval: "daily"
groups:
# Group minor and patch updates together but keep major separate
artifact-minor-patch:
update-types:
- "minor"
- "patch"
- package-ecosystem: "npm"
directory: "/packages/cache"
schedule:
interval: "daily"
groups:
# Group minor and patch updates together but keep major separate
cache-minor-patch:
update-types:
- "minor"
- "patch"

View File

@@ -1,5 +1,3 @@
# Temporarily disabled while v2.0.0 of @actions/artifact is under development
name: artifact-unit-tests
on:
push:
@@ -12,8 +10,8 @@ on:
- '**.md'
jobs:
build:
name: Build
upload:
name: Upload
strategy:
matrix:
@@ -24,12 +22,12 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v3
uses: actions/checkout@v5
- name: Set Node.js 20.x
uses: actions/setup-node@v3
- name: Set Node.js 24.x
uses: actions/setup-node@v5
with:
node-version: 20.x
node-version: 24.x
# Need root node_modules because certain npm packages like jest are configured for the entire repository and it won't be possible
# without these to just compile the artifacts package
@@ -42,53 +40,49 @@ jobs:
npm run tsc
working-directory: packages/artifact
- name: Set artifact file contents
shell: bash
run: |
echo "file1=hello from file 1" >> $GITHUB_ENV
echo "file2=hello from file 2" >> $GITHUB_ENV
- name: Create files that will be uploaded
run: |
mkdir artifact-path
echo '${{ env.file1 }}' > artifact-path/first.txt
echo '${{ env.file2 }}' > artifact-path/second.txt
echo -n 'hello from file 1' > artifact-path/first.txt
echo -n 'hello from file 2' > artifact-path/second.txt
- name: Upload Artifacts using actions/github-script@v6
uses: actions/github-script@v6
- name: Upload Artifacts
uses: actions/github-script@v8
with:
script: |
const artifact = require('./packages/artifact/lib/artifact')
const {default: artifact} = require('./packages/artifact/lib/artifact')
const artifactName = 'my-artifact-${{ matrix.runs-on }}'
console.log('artifactName: ' + artifactName)
const fileContents = ['artifact-path/first.txt','artifact-path/second.txt']
const uploadResult = await artifact.create().uploadArtifact(artifactName, fileContents, './')
const uploadResult = await artifact.uploadArtifact(artifactName, fileContents, './')
console.log(uploadResult)
const success = uploadResult.success
const size = uploadResult.size
const id = uploadResult.id
if (!success) {
throw new Error('Failed to upload artifact')
} else {
console.log(`Successfully uploaded artifact ${id}`)
console.log(`Successfully uploaded artifact ${id}`)
try {
await artifact.uploadArtifact(artifactName, fileContents, './')
throw new Error('should have failed second upload')
} catch (err) {
console.log('Successfully blocked second artifact upload')
}
verify:
name: Verify and Delete
runs-on: ubuntu-latest
needs: [build]
needs: [upload]
steps:
- name: Checkout
uses: actions/checkout@v3
uses: actions/checkout@v5
- name: Set Node.js 20.x
uses: actions/setup-node@v3
- name: Set Node.js 24.x
uses: actions/setup-node@v5
with:
node-version: 20.x
node-version: 24.x
# Need root node_modules because certain npm packages like jest are configured for the entire repository and it won't be possible
# without these to just compile the artifacts package
@@ -101,35 +95,100 @@ jobs:
npm run tsc
working-directory: packages/artifact
- name: List artifacts using actions/github-script@v6
uses: actions/github-script@v6
- name: List and Download Artifacts
uses: actions/github-script@v8
with:
script: |
const artifact = require('./packages/artifact/lib/artifact')
const {default: artifactClient} = require('./packages/artifact/lib/artifact')
const workflowRunId = process.env.GITHUB_RUN_ID
const repository = process.env.GITHUB_REPOSITORY
const repositoryOwner = repository.split('/')[0]
const repositoryName = repository.split('/')[1]
const {readFile} = require('fs/promises')
const path = require('path')
const listResult = await artifact.create().listArtifacts(workflowRunId, repositoryOwner, repositoryName, '${{ secrets.GITHUB_TOKEN }}')
const findBy = {
repositoryOwner: process.env.GITHUB_REPOSITORY.split('/')[0],
repositoryName: process.env.GITHUB_REPOSITORY.split('/')[1],
token: '${{ secrets.GITHUB_TOKEN }}',
workflowRunId: process.env.GITHUB_RUN_ID
}
const listResult = await artifactClient.listArtifacts({latest: true, findBy})
console.log(listResult)
const artifacts = listResult.artifacts
const expected = [
'my-artifact-ubuntu-latest',
'my-artifact-windows-latest',
'my-artifact-macos-latest'
]
if (artifacts.length !== 3) {
throw new Error('Expected 3 artifacts but only found ' + artifacts.length + ' artifacts')
}
const foundArtifacts = artifacts.filter(artifact =>
expected.includes(artifact.name)
)
const artifactNames = artifacts.map(artifact => artifact.name)
if (!artifactNames.includes('my-artifact-ubuntu-latest')){
throw new Error("Expected artifact list to contain an artifact named my-artifact-ubuntu-latest but it's missing")
}
if (!artifactNames.includes('my-artifact-windows-latest')){
throw new Error("Expected artifact list to contain an artifact named my-artifact-windows-latest but it's missing")
}
if (!artifactNames.includes('my-artifact-macos-latest')){
throw new Error("Expected artifact list to contain an artifact named my-artifact-macos-latest but it's missing")
if (foundArtifacts.length !== 3) {
console.log('Unexpected length of found artifacts', foundArtifacts)
throw new Error(
`Expected 3 artifacts but found ${foundArtifacts.length} artifacts.`
)
}
console.log('Successfully listed artifacts that were uploaded')
const files = [
{name: 'artifact-path/first.txt', content: 'hello from file 1'},
{name: 'artifact-path/second.txt', content: 'hello from file 2'}
]
for (const artifact of foundArtifacts) {
const {downloadPath} = await artifactClient.downloadArtifact(artifact.id, {
path: artifact.name,
findBy
})
console.log('Downloaded artifact to:', downloadPath)
for (const file of files) {
const filepath = path.join(
process.env.GITHUB_WORKSPACE,
downloadPath,
file.name
)
console.log('Checking file:', filepath)
const content = await readFile(filepath, 'utf8')
if (content.trim() !== file.content.trim()) {
throw new Error(
`Expected file '${file.name}' to contain '${file.content}' but found '${content}'`
)
}
}
}
- name: Delete Artifacts
uses: actions/github-script@v8
with:
script: |
const {default: artifactClient} = require('./packages/artifact/lib/artifact')
const artifactsToDelete = [
'my-artifact-ubuntu-latest',
'my-artifact-windows-latest',
'my-artifact-macos-latest'
]
for (const artifactName of artifactsToDelete) {
const {id} = await artifactClient.deleteArtifact(artifactName)
}
const {artifacts} = await artifactClient.listArtifacts({latest: true})
const foundArtifacts = artifacts.filter(artifact =>
artifactsToDelete.includes(artifact.name)
)
if (foundArtifacts.length !== 0) {
console.log('Unexpected length of found artifacts:', foundArtifacts)
throw new Error(
`Expected 0 artifacts but found ${foundArtifacts.length} artifacts.`
)
}

View File

@@ -18,12 +18,12 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v3
uses: actions/checkout@v5
- name: Set Node.js 20.x
uses: actions/setup-node@v3
- name: Set Node.js 24.x
uses: actions/setup-node@v5
with:
node-version: 20.x
node-version: 24.x
- name: npm install
run: npm install
@@ -32,7 +32,7 @@ jobs:
run: npm run bootstrap
- name: audit tools (without allow-list)
run: npm audit --audit-level=moderate
run: npm audit --audit-level=moderate --omit dev
- name: audit packages
run: npm run audit-all

View File

@@ -22,12 +22,12 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v3
uses: actions/checkout@v5
- name: Set Node.js 20.x
uses: actions/setup-node@v3
- name: Set Node.js 24.x
uses: actions/setup-node@v5
with:
node-version: 20.x
node-version: 24.x
# In order to save & restore cache from a shell script, certain env variables need to be set that are only available in the
# node context. This runs a local action that gets and sets the necessary env variables that are needed
@@ -39,9 +39,11 @@ jobs:
- name: Install root npm packages
run: npm ci
# We need to install only runtime dependencies (omit dev dependencies) to verify that what we're shipping is all
# that is needed
- name: Compile cache package
run: |
npm ci
npm ci --omit=dev
npm run tsc
working-directory: packages/cache

View File

@@ -17,16 +17,16 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v2
uses: actions/checkout@v5
- shell: bash
run: |
rm "C:\Program Files\Git\usr\bin\tar.exe"
- name: Set Node.js 20.x
uses: actions/setup-node@v1
- name: Set Node.js 24.x
uses: actions/setup-node@v5
with:
node-version: 20.x
node-version: 24.x
# In order to save & restore cache from a shell script, certain env variables need to be set that are only available in the
# node context. This runs a local action that gets and sets the necessary env variables that are needed

View File

@@ -20,18 +20,18 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v3
uses: actions/checkout@v5
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
uses: github/codeql-action/init@v2
with:
languages: javascript
# Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v1
uses: github/codeql-action/autobuild@v2
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1
uses: github/codeql-action/analyze@v2

View File

@@ -1,27 +1,42 @@
name: Publish NPM
run-name: Publish NPM - ${{ github.event.inputs.package }}
on:
workflow_dispatch:
inputs:
package:
type: choice
required: true
description: 'core, artifact, cache, exec, github, glob, http-client, io, tool-cache'
description: 'Which package to release'
options:
- artifact
- attest
- cache
- core
- exec
- github
- glob
- http-client
- io
- tool-cache
jobs:
test:
runs-on: macos-latest
runs-on: macos-latest-large
steps:
- name: setup repo
uses: actions/checkout@v3
uses: actions/checkout@v5
- name: verify package exists
run: ls packages/${{ github.event.inputs.package }}
- name: Set Node.js 20.x
uses: actions/setup-node@v3
- name: Set Node.js 24.x
uses: actions/setup-node@v5
with:
node-version: 20.x
node-version: 24.x
- name: npm install
run: npm install
@@ -40,19 +55,22 @@ jobs:
working-directory: packages/${{ github.event.inputs.package }}
- name: upload artifact
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v4
with:
name: ${{ github.event.inputs.package }}
path: packages/${{ github.event.inputs.package }}/*.tgz
publish:
runs-on: macos-latest
runs-on: macos-latest-large
needs: test
environment: npm-publish
permissions:
contents: read
id-token: write
steps:
- name: download artifact
uses: actions/download-artifact@v3
uses: actions/download-artifact@v4
with:
name: ${{ github.event.inputs.package }}
@@ -62,7 +80,7 @@ jobs:
NPM_TOKEN: ${{ secrets.TOKEN }}
- name: publish
run: npm publish *.tgz
run: npm publish --provenance *.tgz
- name: notify slack on failure
if: failure()

View File

@@ -16,19 +16,23 @@ jobs:
strategy:
matrix:
runs-on: [ubuntu-latest, macos-latest, windows-latest]
runs-on: [ubuntu-latest, macos-latest-large, windows-latest]
# Node 20 is the currently supported stable Node version for actions - https://docs.github.com/actions/sharing-automations/creating-actions/metadata-syntax-for-github-actions#runsusing-for-javascript-actions
# Node 24 is the new version being added with support in actions runners
node-version: [20.x, 24.x]
fail-fast: false
runs-on: ${{ matrix.runs-on }}
steps:
- name: Checkout
uses: actions/checkout@v3
uses: actions/checkout@v5
- name: Set Node.js 20.x
uses: actions/setup-node@v3
- name: Set up Node ${{ matrix.node-version }}
uses: actions/setup-node@v5
with:
node-version: 20.x
node-version: ${{ matrix.node-version }}
- name: npm install
run: npm install

View File

@@ -9,7 +9,7 @@ jobs:
if: ${{ github.repository_owner == 'actions' }}
steps:
- name: Checkout repository
uses: actions/checkout@v3
uses: actions/checkout@v5
- name: Update Octokit
working-directory: packages/github
run: |

View File

@@ -2,3 +2,4 @@
/packages/artifact/ @actions/artifacts-actions
/packages/cache/ @actions/actions-cache
/packages/attest/ @actions/package-security

View File

@@ -24,7 +24,7 @@ The GitHub Actions ToolKit provides a set of packages to make creating actions e
Provides functions for inputs, outputs, results, logging, secrets and variables. Read more [here](packages/core)
```bash
$ npm install @actions/core
npm install @actions/core
```
<br/>
@@ -33,7 +33,7 @@ $ npm install @actions/core
Provides functions to exec cli tools and process output. Read more [here](packages/exec)
```bash
$ npm install @actions/exec
npm install @actions/exec
```
<br/>
@@ -42,7 +42,7 @@ $ npm install @actions/exec
Provides functions to search for files matching glob patterns. Read more [here](packages/glob)
```bash
$ npm install @actions/glob
npm install @actions/glob
```
<br/>
@@ -51,7 +51,7 @@ $ npm install @actions/glob
A lightweight HTTP client optimized for building actions. Read more [here](packages/http-client)
```bash
$ npm install @actions/http-client
npm install @actions/http-client
```
<br/>
@@ -60,7 +60,7 @@ $ npm install @actions/http-client
Provides disk i/o functions like cp, mv, rmRF, which etc. Read more [here](packages/io)
```bash
$ npm install @actions/io
npm install @actions/io
```
<br/>
@@ -71,7 +71,7 @@ Provides functions for downloading and caching tools. e.g. setup-* actions. Rea
See @actions/cache for caching workflow dependencies.
```bash
$ npm install @actions/tool-cache
npm install @actions/tool-cache
```
<br/>
@@ -80,7 +80,7 @@ $ npm install @actions/tool-cache
Provides an Octokit client hydrated with the context that the current action is being run in. Read more [here](packages/github)
```bash
$ npm install @actions/github
npm install @actions/github
```
<br/>
@@ -89,7 +89,7 @@ $ npm install @actions/github
Provides functions to interact with actions artifacts. Read more [here](packages/artifact)
```bash
$ npm install @actions/artifact
npm install @actions/artifact
```
<br/>
@@ -98,7 +98,16 @@ $ npm install @actions/artifact
Provides functions to cache dependencies and build outputs to improve workflow execution time. Read more [here](packages/cache)
```bash
$ npm install @actions/cache
npm install @actions/cache
```
<br/>
:lock_with_ink_pen: [@actions/attest](packages/attest)
Provides functions to write attestations for workflow artifacts. Read more [here](packages/attest)
```bash
npm install @actions/attest
```
<br/>
@@ -218,9 +227,23 @@ console.log(`We can even get context data, like the repo: ${context.repo.repo}`)
```
<br/>
## Contributing
## Note
We welcome contributions. See [how to contribute](.github/CONTRIBUTING.md).
Thank you for your interest in this GitHub repo, however, right now we are not taking contributions.
We continue to focus our resources on strategic areas that help our customers be successful while making developers' lives easier. While GitHub Actions remains a key part of this vision, we are allocating resources towards other areas of Actions and are not taking contributions to this repository at this time. The GitHub public roadmap is the best place to follow along for any updates on features we're working on and what stage they're in.
We are taking the following steps to better direct requests related to GitHub Actions, including:
1. We will be directing questions and support requests to our [Community Discussions area](https://github.com/orgs/community/discussions/categories/actions)
2. High Priority bugs can be reported through Community Discussions or you can report these to our support team https://support.github.com/contact/bug-report.
3. Security Issues should be handled as per our [security.md](SECURITY.md).
We will still provide security updates for this project and fix major breaking changes during this time.
You are welcome to still raise bugs in this repo.
## Code of Conduct

View File

@@ -32,7 +32,7 @@ jobs:
os: [ubuntu-16.04, windows-2019]
runs-on: ${{matrix.os}}
actions:
- uses: actions/setup-node@v3
- uses: actions/setup-node@v5
with:
version: ${{matrix.node}}
- run: |

View File

@@ -18,7 +18,7 @@ e.g. To use https://github.com/actions/setup-node, users will author:
```yaml
steps:
using: actions/setup-node@v3
using: actions/setup-node@v5
```
# Define Metadata

View File

@@ -8,4 +8,4 @@ module.exports = {
'^.+\\.ts$': 'ts-jest'
},
verbose: true
}
}

package-lock.json generated (11165 changed lines)

File diff suppressed because it is too large

View File

@@ -1,6 +1,6 @@
{
"name": "root",
"private": true,
"private": true,
"scripts": {
"audit-all": "lerna run audit-moderate",
"bootstrap": "lerna exec -- npm install",
@@ -13,11 +13,11 @@
"lint": "eslint packages/**/*.ts",
"lint-fix": "eslint packages/**/*.ts --fix",
"new-package": "scripts/create-package",
"test": "jest --testTimeout 10000"
"test": "jest --testTimeout 70000"
},
"devDependencies": {
"@types/jest": "^27.0.2",
"@types/node": "^16.18.1",
"@types/jest": "^29.5.4",
"@types/node": "^24.1.0",
"@types/signale": "^1.4.1",
"concurrently": "^6.1.0",
"eslint": "^8.0.1",
@@ -26,11 +26,25 @@
"eslint-plugin-jest": "^27.2.3",
"eslint-plugin-prettier": "^5.0.0",
"flow-bin": "^0.115.0",
"jest": "^27.2.5",
"lerna": "^7.1.4",
"jest": "^29.6.4",
"lerna": "^6.4.1",
"nx": "16.6.0",
"prettier": "^3.0.0",
"ts-jest": "^27.0.5",
"typescript": "^3.9.9"
"ts-jest": "^29.1.1",
"typescript": "^5.2.2"
},
"overrides": {
"semver": "^7.6.0",
"tar": "^6.2.1",
"@octokit/plugin-paginate-rest": "^9.2.2",
"@octokit/request": "^8.4.1",
"@octokit/request-error": "^5.1.1",
"@octokit/core": "^5.0.3",
"tmp": "^0.2.4",
"@types/node": "^24.1.0",
"brace-expansion": "^2.0.2",
"form-data": "^4.0.4",
"uri-js": "npm:uri-js-replace@^1.0.1",
"node-fetch": "^3.3.2"
}
}

View File

@@ -1,30 +1,44 @@
# Contributions
This package is used internally by the v2+ versions of [upload-artifact](https://github.com/actions/upload-artifact) and [download-artifact](https://github.com/actions/download-artifact). This package can also be used by other actions to interact with artifacts. Any changes or updates to this package will propagate updates to these actions so it is important that major changes or updates get properly tested.
This package is used internally by the v4 versions of [upload-artifact](https://github.com/actions/upload-artifact) and [download-artifact](https://github.com/actions/download-artifact). This package can also be used by other actions to interact with artifacts. Any changes or updates to this package will propagate updates to these actions so it is important that major changes or updates get properly tested.
Any issues or feature requests that are related to the artifact actions should be filled in the appropriate repo.
A limited range of unit tests run as part of each PR when making changes to the artifact packages. For small contributions and fixes, they should be sufficient.
If making large changes, there are a few scenarios that should be tested.
If making large changes, there are a few scenarios that should be tested:
- Uploading very large artifacts (large artifacts get compressed using gzip so compression/decompression must be tested)
- Uploading artifacts with lots of small files (each file is uploaded with its own HTTP call, timeouts and non-success HTTP responses can be expected so they must be properly handled)
- Uploading very large artifacts
- Uploading artifacts with lots of small files
- Uploading artifacts using a self-hosted runner (uploads and downloads behave differently due to extra latency)
- Downloading a single artifact (large and small, if lots of small files are part of an artifact, timeouts and non-success HTTP responses can be expected)
- Downloading all artifacts at once
Large architectural changes can impact upload/download performance so it is important to separately run extra tests. We request that any large contributions/changes have extra detailed testing so we can verify performance and possible regressions.
It is not possible to run end-to-end tests for artifacts as part of a PR in this repo because certain env variables such as `ACTIONS_RUNTIME_URL` are only available from the context of an action as opposed to a shell script. These env variables are needed in order to make the necessary API calls.
Tests will run for every push/pull_request [via Actions](https://github.com/actions/toolkit/blob/main/.github/workflows/artifact-tests.yml).
# Testing
Any easy way to test changes is to fork the artifact actions and to use `npm link` to test your changes.
## Package tests
1. Fork the [upload-artifact](https://github.com/actions/upload-artifact) and [download-artifact](https://github.com/actions/download-artifact) repos
2. Clone the forks locally
3. With your local changes to the toolkit repo, type `npm link` after ensuring there are no errors when running `tsc`
4. In the locally cloned fork, type `npm link @actions/artifact`
4. Create a new release for your local fork using `tsc` and `npm run release` (this will create a new `dist/index.js` file using `@vercel/ncc`)
5. Commit and push your local changes, you will then be able to test your changes with your forked action
To run unit tests for the `@actions/artifact` package:
1. Clone `actions/toolkit` locally
2. Install dependencies: `npm bootstrap`
3. Change working directory to `packages/artifact`
4. Run jest tests: `npm run test`
## Within upload-artifact or download-artifact actions
Any easy way to test changes for the official upload/download actions is to fork them, compile changes and run them.
1. For your local `actions/toolkit` changes:
1. Change directory to `packages/artifact`
2. Compile the changes: `npm run tsc`
3. Symlink your package change: `npm link`
2. Fork and clone either [upload-artifact](https://github.com/actions/upload-artifact) and [download-artifact](https://github.com/actions/download-artifact)
1. In the locally cloned fork, link to your local toolkit changes: `npm link @actions/artifact`
2. Then, compile your changes with: `npm run release`. The local `dist/index.js` should be updated with your changes.
3. Commit and push to your fork, you can then test with a `uses:` in your workflow pointed at your fork.
4. The format for the above is `<username>/<repository-name>/@<ref>`, i.e. `me/myrepo/@HEAD`

View File

@@ -1,13 +1,192 @@
# `@actions/artifact`
## Usage
Interact programmatically with [Actions Artifacts](https://docs.github.com/en/actions/using-workflows/storing-workflow-data-as-artifacts).
You can use this package to interact with the Actions artifacts.
This is the core library that powers the [`@actions/upload-artifact`](https://github.com/actions/upload-artifact) and [`@actions/download-artifact`](https://github.com/actions/download-artifact) actions.
This most recently published version of this package (`1.1.1`) can be found [here](https://github.com/actions/toolkit/tree/@actions/artifact@1.1.1/packages/artifact)
## 🚧 Under construction 🚧
- [`@actions/artifact`](#actionsartifact)
- [v2 - What's New](#v2---whats-new)
- [Improvements](#improvements)
- [Breaking changes](#breaking-changes)
- [Quick Start](#quick-start)
- [Examples](#examples)
- [Upload and Download](#upload-and-download)
- [Delete an Artifact](#delete-an-artifact)
- [Downloading from other workflow runs or repos](#downloading-from-other-workflow-runs-or-repos)
- [Speeding up large uploads](#speeding-up-large-uploads)
- [Additional Resources](#additional-resources)
This package is currently undergoing a major overhaul in preparation for `v4` versions of `upload-artifact` and `download-artifact` (these Actions will use a new `2.0.0` version of `@actions/artifact` that will soon be released). The upcoming version of `@actions/artifact` will take advantage of a major re-architecture with entirely new APIs.
## v2 - What's New
The upcoming `2.0.0` package and `v4` artifact Actions aim to solve some of the major pain-points that have made artifact usage difficult up until now.
> [!IMPORTANT]
> @actions/artifact v2+, upload-artifact@v4+, and download-artifact@v4+ are not currently supported on GHES yet. The previous version of this package can be found at [this tag](https://github.com/actions/toolkit/tree/@actions/artifact@1.1.2/packages/artifact) and [on npm](https://www.npmjs.com/package/@actions/artifact/v/1.1.2).
The release of `@actions/artifact@v2` (including `upload-artifact@v4` and `download-artifact@v4`) are major changes to the backend architecture of Artifacts. They have numerous performance and behavioral improvements.
### Improvements
1. All upload and download operations are much quicker, up to 80% faster download times and 96% faster upload times in worst case scenarios.
2. Once uploaded, an Artifact ID is returned and Artifacts are immediately available in the UI and [REST API](https://docs.github.com/en/rest/actions/artifacts). Previously, you would have to wait for the run to be completed before an ID was available or any APIs could be utilized.
3. Artifacts can now be downloaded and deleted from the UI _before_ the entire workflow run finishes.
4. The contents of an Artifact are uploaded together into an _immutable_ archive. They cannot be altered by subsequent jobs. Both of these factors help reduce the possibility of accidentally corrupting Artifact files. (Digest/integrity hash coming soon in the API!)
5. This library (and `actions/download-artifact`) now support downloading Artifacts from _other_ repositories and runs if a `GITHUB_TOKEN` with sufficient `actions:read` permissions are provided.
### Breaking changes
1. Firewall rules required for self-hosted runners.
If you are using self-hosted runners behind a firewall, you must have flows open to [Actions endpoints](https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners#communication-between-self-hosted-runners-and-github). If you cannot use wildcard rules for your firewall, see the GitHub [meta endpoint](https://api.github.com/meta) for specific endpoints.
e.g.
```bash
curl https://api.github.com/meta | jq .domains.actions
```
2. Uploading to the same named Artifact multiple times.
Due to how Artifacts are created in this new version, it is no longer possible to upload to the same named Artifact multiple times. You must either split the uploads into multiple Artifacts with different names, or only upload once.
3. Limit of Artifacts for an individual job.
Each job in a workflow run now has a limit of 10 artifacts.
## Quick Start
Install the package:
```bash
npm i @actions/artifact
```
Import the module:
```js
// ES6 module
import {DefaultArtifactClient} from '@actions/artifact'
// CommonJS
const {DefaultArtifactClient} = require('@actions/artifact')
```
Then instantiate:
```js
const artifact = new DefaultArtifactClient()
```
For a comprehensive list of classes, interfaces, functions and more, see the [generated documentation](./docs/generated/README.md).
## Examples
### Upload and Download
The most basic scenario is uploading one or more files to an Artifact, then downloading that Artifact. Downloads are based on the Artifact ID, which can be obtained in the response of `uploadArtifact`, `getArtifact`, `listArtifacts` or via the [REST API](https://docs.github.com/en/rest/actions/artifacts).
```js
const {id, size} = await artifact.uploadArtifact(
  // name of the artifact
  'my-artifact',
  // files to include (supports absolute and relative paths)
  ['/absolute/path/file1.txt', './relative/file2.txt'],
  // root directory of the files (paths in the uploaded artifact are kept relative to this)
  '/absolute/path',
  {
    // optional: how long to retain the artifact
    // if unspecified, defaults to repository/org retention settings (the limit of this value)
    retentionDays: 10
  }
)
console.log(`Created artifact with id: ${id} (bytes: ${size})`)
const {downloadPath} = await artifact.downloadArtifact(id, {
// optional: download destination path. otherwise defaults to $GITHUB_WORKSPACE
path: '/tmp/dst/path',
})
console.log(`Downloaded artifact ${id} to: ${downloadPath}`)
```
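If the ID from `uploadArtifact` is no longer at hand (for example, in a later job of the same run), it can be looked up by name first. A minimal sketch:
```js
// Look up the artifact by name, then download it by ID.
const {artifact: found} = await artifact.getArtifact('my-artifact')
const {downloadPath} = await artifact.downloadArtifact(found.id)
console.log(`Downloaded ${found.name} to: ${downloadPath}`)
```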
### Delete an Artifact
To delete an artifact, all you need is the name.
```js
const {id} = await artifact.deleteArtifact(
// name of the artifact
'my-artifact'
)
console.log(`Deleted Artifact ID '${id}'`)
```
Deleting Artifacts from other repositories/runs is also supported, provided a GitHub token with `actions:write` permission on the target repository is supplied.
```js
const findBy = {
// must have actions:write permission on target repository
token: process.env['GITHUB_TOKEN'],
workflowRunId: 123,
repositoryOwner: 'actions',
repositoryName: 'toolkit'
}
const {id} = await artifact.deleteArtifact(
// name of the artifact
'my-artifact',
// options to find by other repo/owner
{ findBy }
)
console.log(`Deleted Artifact ID '${id}' from ${findBy.repositoryOwner}/${findBy.repositoryName}`)
```
### Downloading from other workflow runs or repos
It may be useful to download Artifacts from other workflow runs, or even other repositories. By default, permissions are scoped so that only Artifacts within the current workflow run can be downloaded. To elevate permissions for this scenario, you must pass `options.findBy` to `downloadArtifact`.
```ts
const findBy = {
// must have actions:read permission on target repository
token: process.env['GITHUB_TOKEN'],
workflowRunId: 123,
repositoryOwner: 'actions',
repositoryName: 'toolkit'
}
await artifact.downloadArtifact(1337, {
findBy
})
// can also be used in other methods
await artifact.getArtifact('my-artifact', {
findBy
})
await artifact.listArtifacts({
findBy
})
```
### Speeding up large uploads
If you have large files that need to be uploaded (or file types that don't compress well), you may benefit from changing the compression level of the Artifact archive. NOTE: This is a tradeoff between artifact upload time and stored data size.
```ts
await artifact.uploadArtifact('my-massive-artifact', ['big_file.bin'], '/home/user/files/plz-upload', {
// The level of compression for Zlib to be applied to the artifact archive.
// - 0: No compression
// - 1: Best speed
// - 6: Default compression (same as GNU Gzip)
// - 9: Best compression
compressionLevel: 0
})
```
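For already-compressed content (such as `.zip`, `.png`, or video files), `compressionLevel: 0` is typically the best choice: recompressing such data yields little size benefit while adding upload time.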
## Additional Resources
- [Releases](./RELEASES.md)
- [Contribution Guide](./CONTRIBUTIONS.md)
- [Frequently Asked Questions](./docs/faq.md)

View File

@@ -1,15 +1,178 @@
# @actions/artifact Releases
### 0.1.0
### 4.0.0
- Initial release
- Add support for Node 24 [#2110](https://github.com/actions/toolkit/pull/2110)
- Fix: artifact pagination bugs and configurable artifact count limits [#2165](https://github.com/actions/toolkit/pull/2165)
- Fix: reject the promise on timeout [#2124](https://github.com/actions/toolkit/pull/2124)
- Update dependency versions
### 0.2.0
### 2.3.3
- Fixes to TCP connections not closing
- GZip file compression to speed up downloads
- Improved logging and output
- Extra documentation
- Dependency updates [#2049](https://github.com/actions/toolkit/pull/2049)
### 2.3.2
- Added masking for Shared Access Signature (SAS) artifact URLs [#1982](https://github.com/actions/toolkit/pull/1982)
- Change hash to digest for consistent terminology across runner logs [#1991](https://github.com/actions/toolkit/pull/1991)
### 2.3.1
- Fix comment typo on expectedHash. [#1986](https://github.com/actions/toolkit/pull/1986)
### 2.3.0
- Allow ArtifactClient to perform digest comparisons, if supplied. [#1975](https://github.com/actions/toolkit/pull/1975)
### 2.2.2
- Default concurrency to 5 for uploading artifacts [#1962](https://github.com/actions/toolkit/pull/1962)
### 2.2.1
- Add `ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY` and `ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS` environment variables [#1928](https://github.com/actions/toolkit/pull/1928)
### 2.2.0
- Return artifact digest on upload [#1896](https://github.com/actions/toolkit/pull/1896)
### 2.1.11
- Fixed a bug with relative symlinks resolution [#1844](https://github.com/actions/toolkit/pull/1844)
- Use native `crypto` [#1815](https://github.com/actions/toolkit/pull/1815)
### 2.1.10
- Fixed a regression with symlinks not being automatically resolved [#1830](https://github.com/actions/toolkit/pull/1830)
- Fixed a regression with chunk timeout [#1786](https://github.com/actions/toolkit/pull/1786)
### 2.1.9
- Fixed artifact upload chunk timeout logic [#1774](https://github.com/actions/toolkit/pull/1774)
- Use lazy stream to prevent issues with open file limits [#1771](https://github.com/actions/toolkit/pull/1771)
### 2.1.8
- Allows `*.localhost` domains for hostname checks for local development.
### 2.1.7
- Updated the unzip-stream dependency and reverted to using `unzip.Extract()`
### 2.1.6
- Will retry on invalid request responses.
### 2.1.5
- Bumped `archiver` dependency to 7.0.1
### 2.1.4
- Adds info-level logging for zip extraction
### 2.1.3
- Fixes a bug in the extract logic updated in 2.1.2
### 2.1.2
- Updated the stream extract functionality to use `unzip.Parse()` instead of `unzip.Extract()` for greater control of unzipping artifacts
### 2.1.1
- Updated `isGhes` check to include `.ghe.com` and `.ghe.localhost` as accepted hosts
### 2.1.0
- Added `ArtifactClient#deleteArtifact` to delete artifacts by name [#1626](https://github.com/actions/toolkit/pull/1626)
- Update error messaging to be more useful [#1628](https://github.com/actions/toolkit/pull/1628)
### 2.0.1
- Patch to fix transient request timeouts https://github.com/actions/download-artifact/issues/249
### 2.0.0
- Major release. Supports new Artifact backend for improved speed, reliability and behavior.
- Numerous API changes, [some breaking](./README.md#breaking-changes).
- [Blog post with more info](https://github.blog/2024-02-12-get-started-with-v4-of-github-actions-artifacts/)
### 1.1.1
- Fixed a bug in Node16 where if an HTTP download finished too quickly (<1ms, e.g. when it's mocked) we attempt to delete a temp file that has not been created yet [#1278](https://github.com/actions/toolkit/pull/1278/commits/b9de68a590daf37c6747e38d3cb4f1dd2cfb791c)
### 1.1.0
- Add `x-actions-results-crc64` and `x-actions-results-md5` checksum headers on upload [#1063](https://github.com/actions/toolkit/pull/1063)
### 1.0.2
- Update to v2.0.1 of `@actions/http-client` [#1087](https://github.com/actions/toolkit/pull/1087)
### 1.0.1
- Update to v2.0.0 of `@actions/http-client`
### 1.0.0
- Update `lockfileVersion` to `v2` in `package-lock.json` [#1009](https://github.com/actions/toolkit/pull/1009)
### 0.6.1
- Fix for failing 0 byte file uploads on Windows [#962](https://github.com/actions/toolkit/pull/962)
### 0.6.0
- Support upload from named pipes [#748](https://github.com/actions/toolkit/pull/748)
- Fixes to percentage values being greater than 100% when downloading all artifacts [#889](https://github.com/actions/toolkit/pull/889)
- Improved logging and output during artifact upload [#949](https://github.com/actions/toolkit/pull/949)
- Improvements to client-side validation for certain invalid characters not allowed during upload: [#951](https://github.com/actions/toolkit/pull/951)
- Faster upload speeds for certain types of large files by exempting gzip compression [#956](https://github.com/actions/toolkit/pull/956)
- More detailed logging when dealing with chunked uploads [#957](https://github.com/actions/toolkit/pull/957)
### 0.5.2
- Add HTTP 500 as a retryable status code for artifact upload and download.
### 0.5.1
- Bump @actions/http-client to version 1.0.11 to fix proxy related issues during artifact upload and download
### 0.5.0
- Improved retry-ability for all http calls during artifact upload and download if an error is encountered
### 0.4.2
- Improved retry-ability when a partial artifact download is encountered
### 0.4.1
- Update to latest @actions/core version
### 0.4.0
- Add option to specify custom retentions on artifacts
### 0.3.5
- Retry in the event of a 413 response
### 0.3.3
- Increase chunk size during upload from 4MB to 8MB
- Improve user-agent strings during API calls to help internally diagnose issues
### 0.3.2
- Fix to ensure readstreams get correctly reset in the event of a retry
### 0.3.1
- Fix to ensure temporary gzip files get correctly deleted during artifact upload
- Remove spaces as a forbidden character during upload
### 0.3.0
@@ -20,77 +183,13 @@
- Clearer error message if storage quota has been reached
- Improved logging and output during artifact download
### 0.3.1
### 0.2.0
- Fix to ensure temporary gzip files get correctly deleted during artifact upload
- Remove spaces as a forbidden character during upload
- Fixes to TCP connections not closing
- GZip file compression to speed up downloads
- Improved logging and output
- Extra documentation
### 0.3.2
### 0.1.0
- Fix to ensure readstreams get correctly reset in the event of a retry
### 0.3.3
- Increase chunk size during upload from 4MB to 8MB
- Improve user-agent strings during API calls to help internally diagnose issues
### 0.3.5
- Retry in the event of a 413 response
### 0.4.0
- Add option to specify custom retentions on artifacts
### 0.4.1
- Update to latest @actions/core version
### 0.4.2
- Improved retry-ability when a partial artifact download is encountered
### 0.5.0
- Improved retry-ability for all http calls during artifact upload and download if an error is encountered
### 0.5.1
- Bump @actions/http-client to version 1.0.11 to fix proxy related issues during artifact upload and download
### 0.5.2
- Add HTTP 500 as a retryable status code for artifact upload and download.
### 0.6.0
- Support upload from named pipes [#748](https://github.com/actions/toolkit/pull/748)
- Fixes to percentage values being greater than 100% when downloading all artifacts [#889](https://github.com/actions/toolkit/pull/889)
- Improved logging and output during artifact upload [#949](https://github.com/actions/toolkit/pull/949)
- Improvements to client-side validation for certain invalid characters not allowed during upload: [#951](https://github.com/actions/toolkit/pull/951)
- Faster upload speeds for certain types of large files by exempting gzip compression [#956](https://github.com/actions/toolkit/pull/956)
- More detailed logging when dealing with chunked uploads [#957](https://github.com/actions/toolkit/pull/957)
### 0.6.1
- Fix for failing 0 byte file uploads on Windows [#962](https://github.com/actions/toolkit/pull/962)
### 1.0.0
- Update `lockfileVersion` to `v2` in `package-lock.json` [#1009](https://github.com/actions/toolkit/pull/1009)
### 1.0.1
- Update to v2.0.0 of `@actions/http-client`
### 1.0.2
- Update to v2.0.1 of `@actions/http-client` [#1087](https://github.com/actions/toolkit/pull/1087)
### 1.1.0
- Add `x-actions-results-crc64` and `x-actions-results-md5` checksum headers on upload [#1063](https://github.com/actions/toolkit/pull/1063)
### 1.1.1
- Fixed a bug in Node16 where if an HTTP download finished too quickly (<1ms, e.g. when it's mocked) we attempt to delete a temp file that has not been created yet [#1278](https://github.com/actions/toolkit/pull/1278/commits/b9de68a590daf37c6747e38d3cb4f1dd2cfb791c)
- Initial release

View File

@@ -2,18 +2,21 @@ import * as http from 'http'
import * as net from 'net'
import {HttpClient} from '@actions/http-client'
import * as config from '../src/internal/shared/config'
import {createArtifactTwirpClient} from '../src/internal/shared/artifact-twirp-client'
import * as core from '@actions/core'
import {internalArtifactTwirpClient} from '../src/internal/shared/artifact-twirp-client'
import {noopLogs} from './common'
import {NetworkError, UsageError} from '../src/internal/shared/errors'
jest.mock('@actions/http-client')
const clientOptions = {
maxAttempts: 5,
retryIntervalMs: 1,
retryMultiplier: 1.5
}
describe('artifact-http-client', () => {
beforeAll(() => {
// mock all output so that there is less noise when running tests
jest.spyOn(console, 'log').mockImplementation(() => {})
jest.spyOn(core, 'debug').mockImplementation(() => {})
jest.spyOn(core, 'info').mockImplementation(() => {})
jest.spyOn(core, 'warning').mockImplementation(() => {})
noopLogs()
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('http://localhost:8080')
@@ -25,7 +28,7 @@ describe('artifact-http-client', () => {
})
it('should successfully create a client', () => {
const client = createArtifactTwirpClient('upload')
const client = internalArtifactTwirpClient()
expect(client).toBeDefined()
})
@@ -50,7 +53,7 @@ describe('artifact-http-client', () => {
}
})
const client = createArtifactTwirpClient('upload')
const client = internalArtifactTwirpClient()
const artifact = await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
@@ -98,12 +101,55 @@ describe('artifact-http-client', () => {
}
})
const client = createArtifactTwirpClient(
'upload',
5, // retry 5 times
1, // wait 1 ms
1.5 // backoff factor
)
const client = internalArtifactTwirpClient(clientOptions)
const artifact = await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
expect(mockHttpClient).toHaveBeenCalledTimes(1)
expect(artifact).toBeDefined()
expect(artifact.ok).toBe(true)
expect(artifact.signedUploadUrl).toBe('http://localhost:8080/upload')
expect(mockPost).toHaveBeenCalledTimes(2)
})
it('should retry if invalid body response', async () => {
const mockPost = jest
.fn(() => {
const msgSucceeded = new http.IncomingMessage(new net.Socket())
msgSucceeded.statusCode = 200
return {
message: msgSucceeded,
readBody: async () => {
return Promise.resolve(
`{"ok": true, "signedUploadUrl": "http://localhost:8080/upload"}`
)
}
}
})
.mockImplementationOnce(() => {
const msgFailed = new http.IncomingMessage(new net.Socket())
msgFailed.statusCode = 502
msgFailed.statusMessage = 'Bad Gateway'
return {
message: msgFailed,
readBody: async () => {
return Promise.resolve('💥')
}
}
})
const mockHttpClient = (
HttpClient as unknown as jest.Mock
).mockImplementation(() => {
return {
post: mockPost
}
})
const client = internalArtifactTwirpClient(clientOptions)
const artifact = await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
@@ -138,12 +184,7 @@ describe('artifact-http-client', () => {
post: mockPost
}
})
const client = createArtifactTwirpClient(
'upload',
5, // retry 5 times
1, // wait 1 ms
1.5 // backoff factor
)
const client = internalArtifactTwirpClient(clientOptions)
await expect(async () => {
await client.CreateArtifact({
workflowRunBackendId: '1234',
@@ -178,12 +219,7 @@ describe('artifact-http-client', () => {
post: mockPost
}
})
const client = createArtifactTwirpClient(
'upload',
5, // retry 5 times
1, // wait 1 ms
1.5 // backoff factor
)
const client = internalArtifactTwirpClient(clientOptions)
await expect(async () => {
await client.CreateArtifact({
workflowRunBackendId: '1234',
@@ -197,4 +233,116 @@ describe('artifact-http-client', () => {
expect(mockHttpClient).toHaveBeenCalledTimes(1)
expect(mockPost).toHaveBeenCalledTimes(1)
})
it('should fail with a descriptive error', async () => {
// 409 duplicate error
const mockPost = jest.fn(() => {
const msgFailed = new http.IncomingMessage(new net.Socket())
msgFailed.statusCode = 409
msgFailed.statusMessage = 'Conflict'
return {
message: msgFailed,
readBody: async () => {
return Promise.resolve(
`{"msg": "an artifact with this name already exists on the workflow run"}`
)
}
}
})
const mockHttpClient = (
HttpClient as unknown as jest.Mock
).mockImplementation(() => {
return {
post: mockPost
}
})
const client = internalArtifactTwirpClient(clientOptions)
await expect(async () => {
await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
}).rejects.toThrowError(
'Failed to CreateArtifact: Received non-retryable error: Failed request: (409) Conflict: an artifact with this name already exists on the workflow run'
)
expect(mockHttpClient).toHaveBeenCalledTimes(1)
expect(mockPost).toHaveBeenCalledTimes(1)
})
it('should properly describe a network failure', async () => {
class FakeNodeError extends Error {
code: string
constructor(code: string) {
super()
this.code = code
}
}
const mockPost = jest.fn(() => {
throw new FakeNodeError('ENOTFOUND')
})
const mockHttpClient = (
HttpClient as unknown as jest.Mock
).mockImplementation(() => {
return {
post: mockPost
}
})
const client = internalArtifactTwirpClient()
await expect(async () => {
await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
}).rejects.toThrowError(new NetworkError('ENOTFOUND').message)
expect(mockHttpClient).toHaveBeenCalledTimes(1)
expect(mockPost).toHaveBeenCalledTimes(1)
})
it('should properly describe a usage error', async () => {
const mockPost = jest.fn(() => {
const msgFailed = new http.IncomingMessage(new net.Socket())
msgFailed.statusCode = 403
msgFailed.statusMessage = 'Forbidden'
return {
message: msgFailed,
readBody: async () => {
return Promise.resolve(
`{"msg": "insufficient usage to create artifact"}`
)
}
}
})
const mockHttpClient = (
HttpClient as unknown as jest.Mock
).mockImplementation(() => {
return {
post: mockPost
}
})
const client = internalArtifactTwirpClient()
await expect(async () => {
await client.CreateArtifact({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678',
name: 'artifact',
version: 4
})
}).rejects.toThrowError(new UsageError().message)
expect(mockHttpClient).toHaveBeenCalledTimes(1)
expect(mockPost).toHaveBeenCalledTimes(1)
})
})

View File

@@ -0,0 +1,9 @@
import * as core from '@actions/core'
// noopLogs mocks the console.log and core.* functions to prevent output in the console while testing
export const noopLogs = (): void => {
jest.spyOn(console, 'log').mockImplementation(() => {})
jest.spyOn(core, 'debug').mockImplementation(() => {})
jest.spyOn(core, 'info').mockImplementation(() => {})
jest.spyOn(core, 'warning').mockImplementation(() => {})
}

View File

@@ -0,0 +1,149 @@
import * as config from '../src/internal/shared/config'
import os from 'os'
// Mock the `cpus()` function in the `os` module
jest.mock('os', () => {
const osActual = jest.requireActual('os')
return {
...osActual,
cpus: jest.fn()
}
})
beforeEach(() => {
jest.resetModules()
})
describe('isGhes', () => {
it('should return false when the request domain is github.com', () => {
process.env.GITHUB_SERVER_URL = 'https://github.com'
expect(config.isGhes()).toBe(false)
})
it('should return false when the request domain ends with ghe.com', () => {
process.env.GITHUB_SERVER_URL = 'https://my.domain.ghe.com'
expect(config.isGhes()).toBe(false)
})
it('should return false when the request domain ends with ghe.localhost', () => {
process.env.GITHUB_SERVER_URL = 'https://my.domain.ghe.localhost'
expect(config.isGhes()).toBe(false)
})
it('should return false when the request domain ends with .localhost', () => {
process.env.GITHUB_SERVER_URL = 'https://github.localhost'
expect(config.isGhes()).toBe(false)
})
it('should return true when the request domain is specific to an enterprise', () => {
process.env.GITHUB_SERVER_URL = 'https://my-enterprise.github.com'
expect(config.isGhes()).toBe(true)
})
})
describe('uploadChunkTimeoutEnv', () => {
it('should return default 300000 when no env set', () => {
expect(config.getUploadChunkTimeout()).toBe(300000)
})
it('should return value set in ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS', () => {
process.env.ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS = '150000'
expect(config.getUploadChunkTimeout()).toBe(150000)
})
it('should throw if value set in ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS is invalid', () => {
process.env.ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS = 'abc'
expect(() => {
config.getUploadChunkTimeout()
}).toThrow()
})
})
describe('uploadConcurrencyEnv', () => {
it('Concurrency defaults to 5', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(4))
expect(config.getConcurrency()).toBe(5)
})
it('Concurrency maxes out at 300 on systems with many CPUs', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(32))
process.env.ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY = '301'
expect(config.getConcurrency()).toBe(300)
})
it('Concurrency can be set to 32 when cpu num is <= 4', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(4))
process.env.ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY = '32'
expect(config.getConcurrency()).toBe(32)
})
it('Concurrency can be set to 16 * the number of CPUs when the CPU count is > 4', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(6))
process.env.ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY = '96'
expect(config.getConcurrency()).toBe(96)
})
it('Concurrency can be overridden by env var ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(4))
process.env.ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY = '10'
expect(config.getConcurrency()).toBe(10)
})
it('should throw with invalid value of ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(4))
process.env.ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY = 'abc'
expect(() => {
config.getConcurrency()
}).toThrow()
})
it('should throw if ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY is < 1', () => {
;(os.cpus as jest.Mock).mockReturnValue(new Array(4))
process.env.ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY = '0'
expect(() => {
config.getConcurrency()
}).toThrow()
})
})
describe('getMaxArtifactListCount', () => {
beforeEach(() => {
delete process.env.ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT
})
it('should return default 1000 when no env set', () => {
expect(config.getMaxArtifactListCount()).toBe(1000)
})
it('should return value set in ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT', () => {
process.env.ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT = '2000'
expect(config.getMaxArtifactListCount()).toBe(2000)
})
it('should throw if value set in ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT is invalid', () => {
process.env.ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT = 'abc'
expect(() => {
config.getMaxArtifactListCount()
}).toThrow(
'Invalid value set for ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT env variable'
)
})
it('should throw if ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT is < 1', () => {
process.env.ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT = '0'
expect(() => {
config.getMaxArtifactListCount()
}).toThrow(
'Invalid value set for ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT env variable'
)
})
it('should throw if ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT is negative', () => {
process.env.ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT = '-100'
expect(() => {
config.getMaxArtifactListCount()
}).toThrow(
'Invalid value set for ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT env variable'
)
})
})

View File

@@ -0,0 +1,192 @@
import * as github from '@actions/github'
import type {RestEndpointMethods} from '@octokit/plugin-rest-endpoint-methods/dist-types/generated/method-types'
import type {RequestInterface} from '@octokit/types'
import {
deleteArtifactInternal,
deleteArtifactPublic
} from '../src/internal/delete/delete-artifact'
import * as config from '../src/internal/shared/config'
import {ArtifactServiceClientJSON, Timestamp} from '../src/generated'
import * as util from '../src/internal/shared/util'
import {noopLogs} from './common'
type MockedRequest = jest.MockedFunction<RequestInterface<object>>
type MockedDeleteArtifact = jest.MockedFunction<
RestEndpointMethods['actions']['deleteArtifact']
>
jest.mock('@actions/github', () => ({
getOctokit: jest.fn().mockReturnValue({
request: jest.fn(),
rest: {
actions: {
deleteArtifact: jest.fn()
}
}
})
}))
const fixtures = {
repo: 'toolkit',
owner: 'actions',
token: 'ghp_1234567890',
runId: 123,
backendIds: {
workflowRunBackendId: 'c4d7c21f-ba3f-4ddc-a8c8-6f2f626f8422',
workflowJobRunBackendId: '760803a1-f890-4d25-9a6e-a3fc01a0c7cf'
},
artifacts: [
{
id: 1,
name: 'my-artifact',
size: 456,
createdAt: new Date('2023-12-01')
},
{
id: 2,
name: 'my-artifact',
size: 456,
createdAt: new Date('2023-12-02')
}
]
}
describe('delete-artifact', () => {
beforeAll(() => {
noopLogs()
})
describe('public', () => {
it('should delete an artifact', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
artifacts: [
{
name: fixtures.artifacts[0].name,
id: fixtures.artifacts[0].id,
size_in_bytes: fixtures.artifacts[0].size,
created_at: fixtures.artifacts[0].createdAt.toISOString()
}
]
}
})
const mockDeleteArtifact = github.getOctokit(fixtures.token).rest.actions
.deleteArtifact as MockedDeleteArtifact
mockDeleteArtifact.mockResolvedValueOnce({
status: 204,
headers: {},
url: '',
data: null as never
})
const response = await deleteArtifactPublic(
fixtures.artifacts[0].name,
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token
)
expect(response).toEqual({
id: fixtures.artifacts[0].id
})
})
it('should fail if non-200 response', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
artifacts: [
{
name: fixtures.artifacts[0].name,
id: fixtures.artifacts[0].id,
size_in_bytes: fixtures.artifacts[0].size,
created_at: fixtures.artifacts[0].createdAt.toISOString()
}
]
}
})
const mockDeleteArtifact = github.getOctokit(fixtures.token).rest.actions
.deleteArtifact as MockedDeleteArtifact
mockDeleteArtifact.mockRejectedValue(new Error('boom'))
await expect(
deleteArtifactPublic(
fixtures.artifacts[0].name,
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token
)
).rejects.toThrow('boom')
})
})
describe('internal', () => {
beforeEach(() => {
jest.spyOn(config, 'getRuntimeToken').mockReturnValue('test-token')
jest
.spyOn(util, 'getBackendIdsFromToken')
.mockReturnValue(fixtures.backendIds)
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://results.local')
})
it('should delete an artifact', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: fixtures.artifacts.map(artifact => ({
...fixtures.backendIds,
databaseId: artifact.id.toString(),
name: artifact.name,
size: artifact.size.toString(),
createdAt: Timestamp.fromDate(artifact.createdAt)
}))
})
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'DeleteArtifact')
.mockResolvedValue({
ok: true,
artifactId: fixtures.artifacts[0].id.toString()
})
const response = await deleteArtifactInternal(fixtures.artifacts[0].name)
expect(response).toEqual({
id: fixtures.artifacts[0].id
})
})
it('should fail if non-200 response', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: fixtures.artifacts.map(artifact => ({
...fixtures.backendIds,
databaseId: artifact.id.toString(),
name: artifact.name,
size: artifact.size.toString(),
createdAt: Timestamp.fromDate(artifact.createdAt)
}))
})
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'DeleteArtifact')
.mockRejectedValue(new Error('boom'))
await expect(
deleteArtifactInternal(fixtures.artifacts[0].id)
).rejects.toThrow('boom')
})
})
})

View File

@@ -2,14 +2,21 @@ import fs from 'fs'
import * as http from 'http'
import * as net from 'net'
import * as path from 'path'
import * as core from '@actions/core'
import * as github from '@actions/github'
import {HttpClient} from '@actions/http-client'
import type {RestEndpointMethods} from '@octokit/plugin-rest-endpoint-methods/dist-types/generated/method-types'
import archiver from 'archiver'
import {downloadArtifact} from '../src/internal/download/download-artifact'
import {
downloadArtifactInternal,
downloadArtifactPublic,
streamExtractExternal
} from '../src/internal/download/download-artifact'
import {getUserAgentString} from '../src/internal/shared/user-agent'
import {noopLogs} from './common'
import * as config from '../src/internal/shared/config'
import {ArtifactServiceClientJSON} from '../src/generated'
import * as util from '../src/internal/shared/util'
type MockedDownloadArtifact = jest.MockedFunction<
RestEndpointMethods['actions']['downloadArtifact']
@@ -32,10 +39,16 @@ const fixtures = {
]
},
artifactID: 1234,
artifactName: 'my-artifact',
artifactSize: 123456,
repositoryOwner: 'actions',
repositoryName: 'toolkit',
token: 'ghp_1234567890',
blobStorageUrl: 'https://blob-storage.local?signed=true'
blobStorageUrl: 'https://blob-storage.local?signed=true',
backendIds: {
workflowRunBackendId: 'c4d7c21f-ba3f-4ddc-a8c8-6f2f626f8422',
workflowJobRunBackendId: '760803a1-f890-4d25-9a6e-a3fc01a0c7cf'
}
}
jest.mock('@actions/github', () => ({
@@ -74,206 +87,566 @@ const expectExtractedArchive = async (dir: string): Promise<void> => {
}
}
const setup = async (): Promise<void> => {
noopLogs()
await fs.promises.mkdir(testDir, {recursive: true})
await createTestArchive()
process.env['GITHUB_WORKSPACE'] = fixtures.workspaceDir
}
const cleanup = async (): Promise<void> => {
jest.restoreAllMocks()
await fs.promises.rm(testDir, {recursive: true})
delete process.env['GITHUB_WORKSPACE']
}
const mockGetArtifactSuccess = jest.fn(() => {
const message = new http.IncomingMessage(new net.Socket())
message.statusCode = 200
message.push(fs.readFileSync(fixtures.exampleArtifact.path))
message.push(null)
return {
message
}
})
const mockGetArtifactHung = jest.fn(() => {
const message = new http.IncomingMessage(new net.Socket())
message.statusCode = 200
// Don't push any data or call push(null) to end the stream
// This creates a stream that hangs and never completes
return {
message
}
})
const mockGetArtifactFailure = jest.fn(() => {
const message = new http.IncomingMessage(new net.Socket())
message.statusCode = 500
message.push('Internal Server Error')
message.push(null)
return {
message
}
})
const mockGetArtifactMalicious = jest.fn(() => {
const message = new http.IncomingMessage(new net.Socket())
message.statusCode = 200
message.push(fs.readFileSync(path.join(__dirname, 'fixtures', 'evil.zip'))) // evil.zip contains files that are formatted x/../../etc/hosts
message.push(null)
return {
message
}
})
describe('download-artifact', () => {
beforeEach(async () => {
jest.spyOn(core, 'debug').mockImplementation(() => {})
jest.spyOn(core, 'info').mockImplementation(() => {})
jest.spyOn(core, 'warning').mockImplementation(() => {})
describe('public', () => {
beforeEach(setup)
afterEach(cleanup)
await fs.promises.mkdir(testDir, {recursive: true})
await createTestArchive()
it('should successfully download an artifact to $GITHUB_WORKSPACE', async () => {
const downloadArtifactMock = github.getOctokit(fixtures.token).rest
.actions.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {
location: fixtures.blobStorageUrl
},
status: 302,
url: '',
data: Buffer.from('')
})
process.env['GITHUB_WORKSPACE'] = fixtures.workspaceDir
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactSuccess
}
}
)
const response = await downloadArtifactPublic(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token
)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockGetArtifactSuccess).toHaveBeenCalledWith(
fixtures.blobStorageUrl
)
expectExtractedArchive(fixtures.workspaceDir)
expect(response.downloadPath).toBe(fixtures.workspaceDir)
})
it('should not allow path traversal from malicious artifacts', async () => {
const downloadArtifactMock = github.getOctokit(fixtures.token).rest
.actions.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {
location: fixtures.blobStorageUrl
},
status: 302,
url: '',
data: Buffer.from('')
})
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactMalicious
}
}
)
const response = await downloadArtifactPublic(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token
)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockGetArtifactMalicious).toHaveBeenCalledWith(
fixtures.blobStorageUrl
)
// ensure path traversal was not possible
expect(
fs.existsSync(path.join(fixtures.workspaceDir, 'x/etc/hosts'))
).toBe(true)
expect(
fs.existsSync(path.join(fixtures.workspaceDir, 'y/etc/hosts'))
).toBe(true)
expect(response.downloadPath).toBe(fixtures.workspaceDir)
})
it('should successfully download an artifact to user defined path', async () => {
const customPath = path.join(testDir, 'custom')
const downloadArtifactMock = github.getOctokit(fixtures.token).rest
.actions.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {
location: fixtures.blobStorageUrl
},
status: 302,
url: '',
data: Buffer.from('')
})
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactSuccess
}
}
)
const response = await downloadArtifactPublic(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token,
{
path: customPath
}
)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockGetArtifactSuccess).toHaveBeenCalledWith(
fixtures.blobStorageUrl
)
expectExtractedArchive(customPath)
expect(response.downloadPath).toBe(customPath)
})
it('should fail if download artifact API does not respond with location', async () => {
const downloadArtifactMock = github.getOctokit(fixtures.token).rest
.actions.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {},
status: 302,
url: '',
data: Buffer.from('')
})
await expect(
downloadArtifactPublic(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token
)
).rejects.toBeInstanceOf(Error)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
})
it('should fail if blob storage chunk does not respond within 30s', async () => {
// mock http client to delay response data by 30s
const msg = new http.IncomingMessage(new net.Socket())
msg.statusCode = 200
const mockGet = jest.fn(async () => {
return new Promise((resolve, reject) => {
// Reject with an error after 31 seconds
setTimeout(() => {
reject(new Error('Request timeout'))
}, 31000) // Timeout after 31 seconds
})
})
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGet
}
}
)
await expect(
streamExtractExternal(fixtures.blobStorageUrl, fixtures.workspaceDir)
).rejects.toBeInstanceOf(Error)
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
}, 35000) // add longer timeout to allow for timer to run out
it('should fail if blob storage response is non-200 after 5 retries', async () => {
const downloadArtifactMock = github.getOctokit(fixtures.token).rest
.actions.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {
location: fixtures.blobStorageUrl
},
status: 302,
url: '',
data: Buffer.from('')
})
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactFailure
}
}
)
await expect(
downloadArtifactPublic(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token
)
).rejects.toBeInstanceOf(Error)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockGetArtifactFailure).toHaveBeenCalledWith(
fixtures.blobStorageUrl
)
expect(mockGetArtifactFailure).toHaveBeenCalledTimes(5)
}, 38000)
it('should retry if blob storage response is non-200 and then succeed with a 200', async () => {
const downloadArtifactMock = github.getOctokit(fixtures.token).rest
.actions.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {
location: fixtures.blobStorageUrl
},
status: 302,
url: '',
data: Buffer.from('')
})
const mockGetArtifact = jest
.fn(mockGetArtifactSuccess)
.mockImplementationOnce(mockGetArtifactFailure)
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifact
}
}
)
const response = await downloadArtifactPublic(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token
)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockGetArtifactFailure).toHaveBeenCalledWith(
fixtures.blobStorageUrl
)
expect(mockGetArtifactFailure).toHaveBeenCalledTimes(1)
expect(mockGetArtifactSuccess).toHaveBeenCalledWith(
fixtures.blobStorageUrl
)
expect(mockGetArtifactSuccess).toHaveBeenCalledTimes(1)
expect(response.downloadPath).toBe(fixtures.workspaceDir)
}, 28000)
})
afterEach(async () => {
jest.restoreAllMocks()
await fs.promises.rm(testDir, {recursive: true})
delete process.env['GITHUB_WORKSPACE']
})
describe('internal', () => {
beforeEach(async () => {
await setup()
it('should successfully download an artifact to $GITHUB_WORKSPACE', async () => {
const downloadArtifactMock = github.getOctokit(fixtures.token).rest.actions
.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {
location: fixtures.blobStorageUrl
},
status: 302,
url: '',
data: Buffer.from('')
jest.spyOn(config, 'getRuntimeToken').mockReturnValue('test-token')
jest
.spyOn(util, 'getBackendIdsFromToken')
.mockReturnValue(fixtures.backendIds)
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://results.local')
})
afterEach(async () => {
await cleanup()
})
const getMock = jest.fn(() => {
const message = new http.IncomingMessage(new net.Socket())
message.statusCode = 200
message.push(fs.readFileSync(fixtures.exampleArtifact.path))
return {
message
}
})
const httpClientMock = (HttpClient as jest.Mock).mockImplementation(() => {
return {
get: getMock
}
it('should successfully download an artifact to $GITHUB_WORKSPACE', async () => {
const mockListArtifacts = jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: [
{
...fixtures.backendIds,
databaseId: fixtures.artifactID.toString(),
name: fixtures.artifactName,
size: fixtures.artifactSize.toString()
}
]
})
const mockGetSignedArtifactURL = jest
.spyOn(ArtifactServiceClientJSON.prototype, 'GetSignedArtifactURL')
.mockReturnValue(
Promise.resolve({
signedUrl: fixtures.blobStorageUrl
})
)
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactSuccess
}
}
)
const response = await downloadArtifactInternal(fixtures.artifactID)
expectExtractedArchive(fixtures.workspaceDir)
expect(response.downloadPath).toBe(fixtures.workspaceDir)
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockListArtifacts).toHaveBeenCalledWith({
idFilter: {
value: fixtures.artifactID.toString()
},
...fixtures.backendIds
})
expect(mockGetSignedArtifactURL).toHaveBeenCalledWith({
...fixtures.backendIds,
name: fixtures.artifactName
})
})
const response = await downloadArtifact(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token
)
it('should successfully download an artifact to user defined path', async () => {
const customPath = path.join(testDir, 'custom')
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
expect(httpClientMock).toHaveBeenCalledWith(getUserAgentString())
expect(getMock).toHaveBeenCalledWith(fixtures.blobStorageUrl)
const mockListArtifacts = jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: [
{
...fixtures.backendIds,
databaseId: fixtures.artifactID.toString(),
name: fixtures.artifactName,
size: fixtures.artifactSize.toString()
}
]
})
expectExtractedArchive(fixtures.workspaceDir)
const mockGetSignedArtifactURL = jest
.spyOn(ArtifactServiceClientJSON.prototype, 'GetSignedArtifactURL')
.mockReturnValue(
Promise.resolve({
signedUrl: fixtures.blobStorageUrl
})
)
expect(response.success).toBe(true)
expect(response.downloadPath).toBe(fixtures.workspaceDir)
})
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactSuccess
}
}
)
it('should successfully download an artifact to user defined path', async () => {
const customPath = path.join(testDir, 'custom')
const downloadArtifactMock = github.getOctokit(fixtures.token).rest.actions
.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {
location: fixtures.blobStorageUrl
},
status: 302,
url: '',
data: Buffer.from('')
})
const getMock = jest.fn(() => {
const message = new http.IncomingMessage(new net.Socket())
message.statusCode = 200
message.push(fs.readFileSync(fixtures.exampleArtifact.path))
return {
message
}
})
const httpClientMock = (HttpClient as jest.Mock).mockImplementation(() => {
return {
get: getMock
}
})
const response = await downloadArtifact(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token,
{
const response = await downloadArtifactInternal(fixtures.artifactID, {
path: customPath
}
)
})
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
})
expect(httpClientMock).toHaveBeenCalledWith(getUserAgentString())
expect(getMock).toHaveBeenCalledWith(fixtures.blobStorageUrl)
expectExtractedArchive(customPath)
expect(response.success).toBe(true)
expect(response.downloadPath).toBe(customPath)
})
it('should fail if download artifact API does not respond with location', async () => {
const downloadArtifactMock = github.getOctokit(fixtures.token).rest.actions
.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {},
status: 302,
url: '',
data: Buffer.from('')
expectExtractedArchive(customPath)
expect(response.downloadPath).toBe(customPath)
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockListArtifacts).toHaveBeenCalledWith({
idFilter: {
value: fixtures.artifactID.toString()
},
...fixtures.backendIds
})
expect(mockGetSignedArtifactURL).toHaveBeenCalledWith({
...fixtures.backendIds,
name: fixtures.artifactName
})
})
await expect(
downloadArtifact(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token
it('should fail if download artifact API does not respond with location', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockRejectedValue(new Error('boom'))
await expect(
downloadArtifactInternal(fixtures.artifactID)
).rejects.toBeInstanceOf(Error)
})
it('should fail if blob storage response is non-200', async () => {
const mockListArtifacts = jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: [
{
...fixtures.backendIds,
databaseId: fixtures.artifactID.toString(),
name: fixtures.artifactName,
size: fixtures.artifactSize.toString()
}
]
})
const mockGetSignedArtifactURL = jest
.spyOn(ArtifactServiceClientJSON.prototype, 'GetSignedArtifactURL')
.mockReturnValue(
Promise.resolve({
signedUrl: fixtures.blobStorageUrl
})
)
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockGetArtifactFailure
}
}
)
).rejects.toBeInstanceOf(Error)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
}
await expect(
downloadArtifactInternal(fixtures.artifactID)
).rejects.toBeInstanceOf(Error)
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockListArtifacts).toHaveBeenCalledWith({
idFilter: {
value: fixtures.artifactID.toString()
},
...fixtures.backendIds
})
expect(mockGetSignedArtifactURL).toHaveBeenCalledWith({
...fixtures.backendIds,
name: fixtures.artifactName
})
})
})
it('should fail if blob storage response is non-200', async () => {
const downloadArtifactMock = github.getOctokit(fixtures.token).rest.actions
.downloadArtifact as MockedDownloadArtifact
downloadArtifactMock.mockResolvedValueOnce({
headers: {
location: fixtures.blobStorageUrl
},
status: 302,
url: '',
data: Buffer.from('')
})
describe('streamExtractExternal', () => {
it('should fail if the timeout is exceeded', async () => {
const mockSlowGetArtifact = jest.fn(mockGetArtifactHung)
const getMock = jest.fn(() => {
const message = new http.IncomingMessage(new net.Socket())
message.statusCode = 500
message.push('Internal Server Error')
return {
message
}
})
const httpClientMock = (HttpClient as jest.Mock).mockImplementation(() => {
return {
get: getMock
}
})
await expect(
downloadArtifact(
fixtures.artifactID,
fixtures.repositoryOwner,
fixtures.repositoryName,
fixtures.token
const mockHttpClient = (HttpClient as jest.Mock).mockImplementation(
() => {
return {
get: mockSlowGetArtifact
}
}
)
).rejects.toBeInstanceOf(Error)
expect(downloadArtifactMock).toHaveBeenCalledWith({
owner: fixtures.repositoryOwner,
repo: fixtures.repositoryName,
artifact_id: fixtures.artifactID,
archive_format: 'zip',
request: {
redirect: 'manual'
try {
await streamExtractExternal(
fixtures.blobStorageUrl,
fixtures.workspaceDir,
{timeout: 2}
)
expect(true).toBe(false) // should not be called
} catch (e) {
expect(e).toBeInstanceOf(Error)
expect(e.message).toContain('did not respond in 2ms')
expect(mockHttpClient).toHaveBeenCalledWith(getUserAgentString())
expect(mockSlowGetArtifact).toHaveBeenCalledTimes(1)
}
})
expect(httpClientMock).toHaveBeenCalledWith(getUserAgentString())
expect(getMock).toHaveBeenCalledWith(fixtures.blobStorageUrl)
})
})

Binary file not shown.

View File

@@ -0,0 +1,239 @@
import * as github from '@actions/github'
import type {RequestInterface} from '@octokit/types'
import {
getArtifactInternal,
getArtifactPublic
} from '../src/internal/find/get-artifact'
import * as config from '../src/internal/shared/config'
import {ArtifactServiceClientJSON, Timestamp} from '../src/generated'
import * as util from '../src/internal/shared/util'
import {noopLogs} from './common'
import {
ArtifactNotFoundError,
InvalidResponseError
} from '../src/internal/shared/errors'
type MockedRequest = jest.MockedFunction<RequestInterface<object>>
jest.mock('@actions/github', () => ({
getOctokit: jest.fn().mockReturnValue({
request: jest.fn()
})
}))
const fixtures = {
repo: 'toolkit',
owner: 'actions',
token: 'ghp_1234567890',
runId: 123,
backendIds: {
workflowRunBackendId: 'c4d7c21f-ba3f-4ddc-a8c8-6f2f626f8422',
workflowJobRunBackendId: '760803a1-f890-4d25-9a6e-a3fc01a0c7cf'
},
artifacts: [
{
id: 1,
name: 'my-artifact',
size: 456,
createdAt: new Date('2023-12-01')
},
{
id: 2,
name: 'my-artifact',
size: 456,
createdAt: new Date('2023-12-02')
}
]
}
describe('get-artifact', () => {
beforeAll(() => {
noopLogs()
})
describe('public', () => {
it('should return the artifact if it is found', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
artifacts: [
{
name: fixtures.artifacts[0].name,
id: fixtures.artifacts[0].id,
size_in_bytes: fixtures.artifacts[0].size,
created_at: fixtures.artifacts[0].createdAt.toISOString()
}
]
}
})
const response = await getArtifactPublic(
fixtures.artifacts[0].name,
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token
)
expect(response).toEqual({
artifact: fixtures.artifacts[0]
})
})
it('should return the latest artifact if multiple are found', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
artifacts: fixtures.artifacts.map(artifact => ({
name: artifact.name,
id: artifact.id,
size_in_bytes: artifact.size,
created_at: artifact.createdAt.toISOString()
}))
}
})
const response = await getArtifactPublic(
fixtures.artifacts[0].name,
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token
)
expect(response).toEqual({
artifact: fixtures.artifacts[1]
})
})
it('should fail if no artifacts are found', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
artifacts: []
}
})
const response = getArtifactPublic(
fixtures.artifacts[0].name,
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token
)
expect(response).rejects.toThrowError(ArtifactNotFoundError)
})
it('should fail if non-200 response', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 404,
headers: {},
url: '',
data: {}
})
const response = getArtifactPublic(
fixtures.artifacts[0].name,
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token
)
expect(response).rejects.toThrowError(InvalidResponseError)
})
})
describe('internal', () => {
beforeEach(() => {
jest.spyOn(config, 'getRuntimeToken').mockReturnValue('test-token')
jest
.spyOn(util, 'getBackendIdsFromToken')
.mockReturnValue(fixtures.backendIds)
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://results.local')
})
it('should return the artifact if it is found', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: [
{
...fixtures.backendIds,
databaseId: fixtures.artifacts[0].id.toString(),
name: fixtures.artifacts[0].name,
size: fixtures.artifacts[0].size.toString(),
createdAt: Timestamp.fromDate(fixtures.artifacts[0].createdAt)
}
]
})
const response = await getArtifactInternal(fixtures.artifacts[0].name)
expect(response).toEqual({
artifact: fixtures.artifacts[0]
})
})
it('should return the latest artifact if multiple are found', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: fixtures.artifacts.map(artifact => ({
...fixtures.backendIds,
databaseId: artifact.id.toString(),
name: artifact.name,
size: artifact.size.toString(),
createdAt: Timestamp.fromDate(artifact.createdAt)
}))
})
const response = await getArtifactInternal(fixtures.artifacts[0].name)
expect(response).toEqual({
artifact: fixtures.artifacts[1]
})
})
it('should fail if no artifacts are found', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: []
})
const response = getArtifactInternal(fixtures.artifacts[0].name)
expect(response).rejects.toThrowError(ArtifactNotFoundError)
})
it('should fail if non-200 response', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockRejectedValue(new Error('boom'))
const response = getArtifactInternal(fixtures.artifacts[0].name)
expect(response).rejects.toThrow()
})
})
})

View File

@@ -0,0 +1,361 @@
import * as github from '@actions/github'
import type {RestEndpointMethodTypes} from '@octokit/plugin-rest-endpoint-methods/dist-types/generated/parameters-and-response-types'
import {
listArtifactsInternal,
listArtifactsPublic
} from '../src/internal/find/list-artifacts'
import * as config from '../src/internal/shared/config'
import {ArtifactServiceClientJSON, Timestamp} from '../src/generated'
import * as util from '../src/internal/shared/util'
import {noopLogs} from './common'
import {Artifact} from '../src/internal/shared/interfaces'
import {RequestInterface} from '@octokit/types'
type MockedRequest = jest.MockedFunction<RequestInterface<object>>
jest.mock('@actions/github', () => ({
getOctokit: jest.fn().mockReturnValue({
request: jest.fn(),
rest: {
actions: {
listWorkflowRunArtifacts: jest.fn()
}
}
})
}))
const artifactsToListResponse = (
artifacts: Artifact[]
): RestEndpointMethodTypes['actions']['listWorkflowRunArtifacts']['response']['data'] => {
return {
total_count: artifacts.length,
artifacts: artifacts.map(artifact => ({
name: artifact.name,
id: artifact.id,
size_in_bytes: artifact.size,
created_at: artifact.createdAt?.toISOString() || '',
run_id: fixtures.runId,
// unused fields for tests
url: '',
archive_download_url: '',
expired: false,
expires_at: '',
node_id: '',
run_url: '',
type: '',
updated_at: ''
}))
}
}
const fixtures = {
repo: 'toolkit',
owner: 'actions',
token: 'ghp_1234567890',
runId: 123,
backendIds: {
workflowRunBackendId: 'c4d7c21f-ba3f-4ddc-a8c8-6f2f626f8422',
workflowJobRunBackendId: '760803a1-f890-4d25-9a6e-a3fc01a0c7cf'
},
artifacts: [
{
id: 1,
name: 'my-artifact',
size: 456,
createdAt: new Date('2023-12-01')
},
{
id: 2,
name: 'my-artifact',
size: 456,
createdAt: new Date('2023-12-02')
}
]
}
describe('list-artifact', () => {
beforeAll(() => {
noopLogs()
})
describe('public', () => {
it('should return a list of artifacts', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: artifactsToListResponse(fixtures.artifacts)
})
const response = await listArtifactsPublic(
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token,
false
)
expect(response).toEqual({
artifacts: fixtures.artifacts
})
})
it('should return the latest artifact when latest is specified', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: artifactsToListResponse(fixtures.artifacts)
})
const response = await listArtifactsPublic(
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token,
true
)
expect(response).toEqual({
artifacts: [fixtures.artifacts[1]]
})
})
it('can return empty artifacts', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
total_count: 0,
artifacts: []
}
})
const response = await listArtifactsPublic(
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token,
true
)
expect(response).toEqual({
artifacts: []
})
})
it('should fail if non-200 response', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
mockRequest.mockRejectedValueOnce(new Error('boom'))
await expect(
listArtifactsPublic(
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token,
false
)
).rejects.toThrow('boom')
})
it('should handle pagination correctly when fetching multiple pages', async () => {
const mockRequest = github.getOctokit(fixtures.token)
.request as MockedRequest
const manyArtifacts = Array.from({length: 150}, (_, i) => ({
id: i + 1,
name: `artifact-${i + 1}`,
size: 100,
createdAt: new Date('2023-12-01')
}))
mockRequest
.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
...artifactsToListResponse(manyArtifacts.slice(0, 100)),
total_count: 150
}
})
.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
...artifactsToListResponse(manyArtifacts.slice(100, 150)),
total_count: 150
}
})
const response = await listArtifactsPublic(
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token,
false
)
// Verify that both API calls were made
expect(mockRequest).toHaveBeenCalledTimes(2)
// Should return all 150 artifacts across both pages
expect(response.artifacts).toHaveLength(150)
// Verify we got artifacts from both pages
expect(response.artifacts[0].name).toBe('artifact-1')
expect(response.artifacts[99].name).toBe('artifact-100')
expect(response.artifacts[100].name).toBe('artifact-101')
expect(response.artifacts[149].name).toBe('artifact-150')
})
it('should respect ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT environment variable', async () => {
const originalEnv = process.env.ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT
process.env.ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT = '150'
jest.resetModules()
try {
const {listArtifactsPublic: listArtifactsPublicReloaded} = await import(
'../src/internal/find/list-artifacts'
)
const githubReloaded = await import('@actions/github')
const mockRequest = (githubReloaded.getOctokit as jest.Mock)(
fixtures.token
).request as MockedRequest
const manyArtifacts = Array.from({length: 200}, (_, i) => ({
id: i + 1,
name: `artifact-${i + 1}`,
size: 100,
createdAt: new Date('2023-12-01')
}))
mockRequest
.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
...artifactsToListResponse(manyArtifacts.slice(0, 100)),
total_count: 200
}
})
.mockResolvedValueOnce({
status: 200,
headers: {},
url: '',
data: {
...artifactsToListResponse(manyArtifacts.slice(100, 150)),
total_count: 200
}
})
const response = await listArtifactsPublicReloaded(
fixtures.runId,
fixtures.owner,
fixtures.repo,
fixtures.token,
false
)
// Should only return 150 artifacts due to the limit
expect(response.artifacts).toHaveLength(150)
expect(response.artifacts[0].name).toBe('artifact-1')
expect(response.artifacts[149].name).toBe('artifact-150')
} finally {
// Restore original environment variable
if (originalEnv !== undefined) {
process.env.ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT = originalEnv
} else {
delete process.env.ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT
}
// Reset modules again to restore original state
jest.resetModules()
}
})
})
describe('internal', () => {
beforeEach(() => {
jest.spyOn(config, 'getRuntimeToken').mockReturnValue('test-token')
jest
.spyOn(util, 'getBackendIdsFromToken')
.mockReturnValue(fixtures.backendIds)
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://results.local')
})
it('should return a list of artifacts', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: fixtures.artifacts.map(artifact => ({
...fixtures.backendIds,
databaseId: artifact.id.toString(),
name: artifact.name,
size: artifact.size.toString(),
createdAt: Timestamp.fromDate(artifact.createdAt)
}))
})
const response = await listArtifactsInternal(false)
expect(response).toEqual({
artifacts: fixtures.artifacts
})
})
it('should return the latest artifact when latest is specified', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: fixtures.artifacts.map(artifact => ({
...fixtures.backendIds,
databaseId: artifact.id.toString(),
name: artifact.name,
size: artifact.size.toString(),
createdAt: Timestamp.fromDate(artifact.createdAt)
}))
})
const response = await listArtifactsInternal(true)
expect(response).toEqual({
artifacts: [fixtures.artifacts[1]]
})
})
it('can return empty artifacts', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockResolvedValue({
artifacts: []
})
const response = await listArtifactsInternal(false)
expect(response).toEqual({
artifacts: []
})
})
it('should fail if non-200 response', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'ListArtifacts')
.mockRejectedValue(new Error('boom'))
await expect(listArtifactsInternal(false)).rejects.toThrow('boom')
})
})
})

View File

@@ -3,15 +3,11 @@ import {
validateFilePath
} from '../src/internal/upload/path-and-artifact-name-validation'
import * as core from '@actions/core'
import {noopLogs} from './common'
describe('Path and artifact name validation', () => {
beforeAll(() => {
// mock all output so that there is less noise when running tests
jest.spyOn(console, 'log').mockImplementation(() => {})
jest.spyOn(core, 'debug').mockImplementation(() => {})
jest.spyOn(core, 'info').mockImplementation(() => {})
jest.spyOn(core, 'warning').mockImplementation(() => {})
noopLogs()
})
it('Check Artifact Name for any invalid characters', () => {

View File

@@ -1,264 +1,173 @@
import * as core from '@actions/core'
import * as uploadZipSpecification from '../src/internal/upload/upload-zip-specification'
import * as zip from '../src/internal/upload/zip'
import * as util from '../src/internal/shared/util'
import * as retention from '../src/internal/upload/retention'
import * as config from '../src/internal/shared/config'
import {Timestamp, ArtifactServiceClientJSON} from '../src/generated'
import {ArtifactServiceClientJSON} from '../src/generated'
import * as blobUpload from '../src/internal/upload/blob-upload'
import {uploadArtifact} from '../src/internal/upload/upload-artifact'
import {noopLogs} from './common'
import {FilesNotFoundError} from '../src/internal/shared/errors'
import {BlockBlobUploadStreamOptions} from '@azure/storage-blob'
import * as fs from 'fs'
import * as path from 'path'
import unzip from 'unzip-stream'
const uploadStreamMock = jest.fn()
const blockBlobClientMock = jest.fn().mockImplementation(() => ({
uploadStream: uploadStreamMock
}))
jest.mock('@azure/storage-blob', () => ({
BlobClient: jest.fn().mockImplementation(() => {
return {
getBlockBlobClient: blockBlobClientMock
}
})
}))
const fixtures = {
uploadDirectory: path.join(__dirname, '_temp', 'plz-upload'),
files: [
{name: 'file1.txt', content: 'test 1 file content'},
{name: 'file2.txt', content: 'test 2 file content'},
{name: 'file3.txt', content: 'test 3 file content'},
{
name: 'real.txt',
content: 'from a symlink'
},
{
name: 'relative.txt',
content: 'from a symlink',
symlink: 'real.txt',
relative: true
},
{
name: 'absolute.txt',
content: 'from a symlink',
symlink: 'real.txt',
relative: false
}
],
backendIDs: {
workflowRunBackendId: '67dbcc20-e851-4452-a7c3-2cc0d2e0ec67',
workflowJobRunBackendId: '5f49179d-3386-4c38-85f7-00f8138facd0'
},
runtimeToken: 'test-token',
resultsServiceURL: 'http://results.local',
inputs: {
artifactName: 'test-artifact',
files: [
'/home/user/files/plz-upload/file1.txt',
'/home/user/files/plz-upload/file2.txt',
'/home/user/files/plz-upload/dir/file3.txt'
],
rootDirectory: '/home/user/files/plz-upload'
}
}
describe('upload-artifact', () => {
beforeAll(() => {
fs.mkdirSync(fixtures.uploadDirectory, {
recursive: true
})
for (const file of fixtures.files) {
if (file.symlink) {
let symlinkPath = file.symlink
if (!file.relative) {
symlinkPath = path.join(fixtures.uploadDirectory, file.symlink)
}
if (!fs.existsSync(path.join(fixtures.uploadDirectory, file.name))) {
fs.symlinkSync(
symlinkPath,
path.join(fixtures.uploadDirectory, file.name),
'file'
)
}
} else {
fs.writeFileSync(
path.join(fixtures.uploadDirectory, file.name),
file.content
)
}
}
})
beforeEach(() => {
// mock all output so that there is less noise when running tests
jest.spyOn(console, 'log').mockImplementation(() => {})
jest.spyOn(core, 'debug').mockImplementation(() => {})
jest.spyOn(core, 'info').mockImplementation(() => {})
jest.spyOn(core, 'warning').mockImplementation(() => {})
noopLogs()
jest
.spyOn(uploadZipSpecification, 'validateRootDirectory')
.mockReturnValue()
jest
.spyOn(util, 'getBackendIdsFromToken')
.mockReturnValue(fixtures.backendIDs)
jest
.spyOn(uploadZipSpecification, 'getUploadZipSpecification')
.mockReturnValue(
fixtures.files.map(file => ({
sourcePath: path.join(fixtures.uploadDirectory, file.name),
destinationPath: file.name,
stats: fs.statSync(path.join(fixtures.uploadDirectory, file.name))
}))
)
jest.spyOn(config, 'getRuntimeToken').mockReturnValue(fixtures.runtimeToken)
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue(fixtures.resultsServiceURL)
})
afterEach(() => {
jest.restoreAllMocks()
})
it('should successfully upload an artifact', () => {
const mockDate = new Date('2020-01-01')
jest
.spyOn(uploadZipSpecification, 'validateRootDirectory')
.mockReturnValue()
jest
.spyOn(uploadZipSpecification, 'getUploadZipSpecification')
.mockReturnValue([
{
sourcePath: '/home/user/files/plz-upload/file1.txt',
destinationPath: 'file1.txt'
},
{
sourcePath: '/home/user/files/plz-upload/file2.txt',
destinationPath: 'file2.txt'
},
{
sourcePath: '/home/user/files/plz-upload/dir/file3.txt',
destinationPath: 'dir/file3.txt'
}
])
jest
.spyOn(zip, 'createZipUploadStream')
.mockReturnValue(Promise.resolve(new zip.ZipUploadStream(1)))
jest.spyOn(util, 'getBackendIdsFromToken').mockReturnValue({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678'
})
jest
.spyOn(retention, 'getExpiration')
.mockReturnValue(Timestamp.fromDate(mockDate))
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'CreateArtifact')
.mockReturnValue(
Promise.resolve({
ok: true,
signedUploadUrl: 'https://signed-upload-url.com'
})
)
jest.spyOn(blobUpload, 'uploadZipToBlobStorage').mockReturnValue(
Promise.resolve({
isSuccess: true,
uploadSize: 1234,
md5Hash: 'test-md5-hash'
})
)
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'FinalizeArtifact')
.mockReturnValue(Promise.resolve({ok: true, artifactId: '1'}))
// ArtifactHttpClient mocks
jest.spyOn(config, 'getRuntimeToken').mockReturnValue('test-token')
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://test-url.com')
const uploadResp = uploadArtifact(
'test-artifact',
[
'/home/user/files/plz-upload/file1.txt',
'/home/user/files/plz-upload/file2.txt',
'/home/user/files/plz-upload/dir/file3.txt'
],
'/home/user/files/plz-upload'
)
expect(uploadResp).resolves.toEqual({success: true, size: 1234, id: 1})
})
it('should throw an error if the root directory is invalid', async () => {
jest
.spyOn(uploadZipSpecification, 'validateRootDirectory')
.mockImplementation(() => {
throw new Error('Invalid root directory')
})
const uploadResp = uploadArtifact(
'test-artifact',
[
'/home/user/files/plz-upload/file1.txt',
'/home/user/files/plz-upload/file2.txt',
'/home/user/files/plz-upload/dir/file3.txt'
],
'/home/user/files/plz-upload'
)
await expect(uploadResp).rejects.toThrow('Invalid root directory')
})
it('should return false if there are no files to upload', () => {
jest
.spyOn(uploadZipSpecification, 'validateRootDirectory')
.mockReturnValue()
it('should reject if there are no files to upload', async () => {
jest
.spyOn(uploadZipSpecification, 'getUploadZipSpecification')
.mockClear()
.mockReturnValue([])
const uploadResp = uploadArtifact(
'test-artifact',
[
'/home/user/files/plz-upload/file1.txt',
'/home/user/files/plz-upload/file2.txt',
'/home/user/files/plz-upload/dir/file3.txt'
],
'/home/user/files/plz-upload'
fixtures.inputs.artifactName,
fixtures.inputs.files,
fixtures.inputs.rootDirectory
)
expect(uploadResp).resolves.toEqual({success: false})
await expect(uploadResp).rejects.toThrowError(FilesNotFoundError)
})
it('should return false if no backend IDs are found', () => {
jest
.spyOn(uploadZipSpecification, 'validateRootDirectory')
.mockReturnValue()
jest
.spyOn(uploadZipSpecification, 'getUploadZipSpecification')
.mockReturnValue([
{
sourcePath: '/home/user/files/plz-upload/file1.txt',
destinationPath: 'file1.txt'
},
{
sourcePath: '/home/user/files/plz-upload/file2.txt',
destinationPath: 'file2.txt'
},
{
sourcePath: '/home/user/files/plz-upload/dir/file3.txt',
destinationPath: 'dir/file3.txt'
}
])
jest
.spyOn(zip, 'createZipUploadStream')
.mockReturnValue(Promise.resolve(new zip.ZipUploadStream(1)))
jest
.spyOn(util, 'getBackendIdsFromToken')
.mockReturnValue({workflowRunBackendId: '', workflowJobRunBackendId: ''})
it('should reject if no backend IDs are found', async () => {
jest.spyOn(util, 'getBackendIdsFromToken').mockRestore()
const uploadResp = uploadArtifact(
'test-artifact',
[
'/home/user/files/plz-upload/file1.txt',
'/home/user/files/plz-upload/file2.txt',
'/home/user/files/plz-upload/dir/file3.txt'
],
'/home/user/files/plz-upload'
fixtures.inputs.artifactName,
fixtures.inputs.files,
fixtures.inputs.rootDirectory
)
expect(uploadResp).resolves.toEqual({success: false})
await expect(uploadResp).rejects.toThrow()
})
it('should return false if the creation request fails', () => {
const mockDate = new Date('2020-01-01')
jest
.spyOn(uploadZipSpecification, 'validateRootDirectory')
.mockReturnValue()
jest
.spyOn(uploadZipSpecification, 'getUploadZipSpecification')
.mockReturnValue([
{
sourcePath: '/home/user/files/plz-upload/file1.txt',
destinationPath: 'file1.txt'
},
{
sourcePath: '/home/user/files/plz-upload/file2.txt',
destinationPath: 'file2.txt'
},
{
sourcePath: '/home/user/files/plz-upload/dir/file3.txt',
destinationPath: 'dir/file3.txt'
}
])
it('should reject if the creation request fails', async () => {
jest
.spyOn(zip, 'createZipUploadStream')
.mockReturnValue(Promise.resolve(new zip.ZipUploadStream(1)))
jest.spyOn(util, 'getBackendIdsFromToken').mockReturnValue({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678'
})
jest
.spyOn(retention, 'getExpiration')
.mockReturnValue(Timestamp.fromDate(mockDate))
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'CreateArtifact')
.mockReturnValue(Promise.resolve({ok: false, signedUploadUrl: ''}))
// ArtifactHttpClient mocks
jest.spyOn(config, 'getRuntimeToken').mockReturnValue('test-token')
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://test-url.com')
const uploadResp = uploadArtifact(
'test-artifact',
[
'/home/user/files/plz-upload/file1.txt',
'/home/user/files/plz-upload/file2.txt',
'/home/user/files/plz-upload/dir/file3.txt'
],
'/home/user/files/plz-upload'
fixtures.inputs.artifactName,
fixtures.inputs.files,
fixtures.inputs.rootDirectory
)
expect(uploadResp).resolves.toEqual({success: false})
await expect(uploadResp).rejects.toThrow()
})
it('should return false if blob storage upload is unsuccessful', () => {
const mockDate = new Date('2020-01-01')
jest
.spyOn(uploadZipSpecification, 'validateRootDirectory')
.mockReturnValue()
jest
.spyOn(uploadZipSpecification, 'getUploadZipSpecification')
.mockReturnValue([
{
sourcePath: '/home/user/files/plz-upload/file1.txt',
destinationPath: 'file1.txt'
},
{
sourcePath: '/home/user/files/plz-upload/file2.txt',
destinationPath: 'file2.txt'
},
{
sourcePath: '/home/user/files/plz-upload/dir/file3.txt',
destinationPath: 'dir/file3.txt'
}
])
it('should reject if blob storage upload is unsuccessful', async () => {
jest
.spyOn(zip, 'createZipUploadStream')
.mockReturnValue(Promise.resolve(new zip.ZipUploadStream(1)))
jest.spyOn(util, 'getBackendIdsFromToken').mockReturnValue({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678'
})
jest
.spyOn(retention, 'getExpiration')
.mockReturnValue(Timestamp.fromDate(mockDate))
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'CreateArtifact')
.mockReturnValue(
@@ -269,59 +178,21 @@ describe('upload-artifact', () => {
)
jest
.spyOn(blobUpload, 'uploadZipToBlobStorage')
.mockReturnValue(Promise.resolve({isSuccess: false}))
// ArtifactHttpClient mocks
jest.spyOn(config, 'getRuntimeToken').mockReturnValue('test-token')
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://test-url.com')
.mockReturnValue(Promise.reject(new Error('boom')))
const uploadResp = uploadArtifact(
'test-artifact',
[
'/home/user/files/plz-upload/file1.txt',
'/home/user/files/plz-upload/file2.txt',
'/home/user/files/plz-upload/dir/file3.txt'
],
'/home/user/files/plz-upload'
fixtures.inputs.artifactName,
fixtures.inputs.files,
fixtures.inputs.rootDirectory
)
expect(uploadResp).resolves.toEqual({success: false})
await expect(uploadResp).rejects.toThrow()
})
it('should return false if finalize artifact fails', () => {
const mockDate = new Date('2020-01-01')
jest
.spyOn(uploadZipSpecification, 'validateRootDirectory')
.mockReturnValue()
jest
.spyOn(uploadZipSpecification, 'getUploadZipSpecification')
.mockReturnValue([
{
sourcePath: '/home/user/files/plz-upload/file1.txt',
destinationPath: 'file1.txt'
},
{
sourcePath: '/home/user/files/plz-upload/file2.txt',
destinationPath: 'file2.txt'
},
{
sourcePath: '/home/user/files/plz-upload/dir/file3.txt',
destinationPath: 'dir/file3.txt'
}
])
it('should reject if finalize artifact fails', async () => {
jest
.spyOn(zip, 'createZipUploadStream')
.mockReturnValue(Promise.resolve(new zip.ZipUploadStream(1)))
jest.spyOn(util, 'getBackendIdsFromToken').mockReturnValue({
workflowRunBackendId: '1234',
workflowJobRunBackendId: '5678'
})
jest
.spyOn(retention, 'getExpiration')
.mockReturnValue(Timestamp.fromDate(mockDate))
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'CreateArtifact')
.mockReturnValue(
@@ -332,31 +203,171 @@ describe('upload-artifact', () => {
)
jest.spyOn(blobUpload, 'uploadZipToBlobStorage').mockReturnValue(
Promise.resolve({
isSuccess: true,
uploadSize: 1234,
md5Hash: 'test-md5-hash'
sha256Hash: 'test-sha256-hash'
})
)
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'FinalizeArtifact')
.mockReturnValue(Promise.resolve({ok: false, artifactId: ''}))
// ArtifactHttpClient mocks
jest.spyOn(config, 'getRuntimeToken').mockReturnValue('test-token')
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://test-url.com')
const uploadResp = uploadArtifact(
'test-artifact',
[
'/home/user/files/plz-upload/file1.txt',
'/home/user/files/plz-upload/file2.txt',
'/home/user/files/plz-upload/dir/file3.txt'
],
'/home/user/files/plz-upload'
fixtures.inputs.artifactName,
fixtures.inputs.files,
fixtures.inputs.rootDirectory
)
expect(uploadResp).resolves.toEqual({success: false})
await expect(uploadResp).rejects.toThrow()
})
it('should successfully upload an artifact', async () => {
jest
.spyOn(uploadZipSpecification, 'getUploadZipSpecification')
.mockRestore()
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'CreateArtifact')
.mockReturnValue(
Promise.resolve({
ok: true,
signedUploadUrl: 'https://signed-upload-url.local'
})
)
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'FinalizeArtifact')
.mockReturnValue(
Promise.resolve({
ok: true,
artifactId: '1'
})
)
let loadedBytes = 0
const uploadedZip = path.join(
fixtures.uploadDirectory,
'..',
'uploaded.zip'
)
uploadStreamMock.mockImplementation(
async (
stream: NodeJS.ReadableStream,
bufferSize?: number,
maxConcurrency?: number,
options?: BlockBlobUploadStreamOptions
) => {
const {onProgress} = options || {}
if (fs.existsSync(uploadedZip)) {
fs.unlinkSync(uploadedZip)
}
const uploadedZipStream = fs.createWriteStream(uploadedZip)
onProgress?.({loadedBytes: 0})
return new Promise((resolve, reject) => {
stream.on('data', chunk => {
loadedBytes += chunk.length
uploadedZipStream.write(chunk)
onProgress?.({loadedBytes})
})
stream.on('end', () => {
onProgress?.({loadedBytes})
uploadedZipStream.end()
resolve({})
})
stream.on('error', err => {
reject(err)
})
})
}
)
const {id, size, digest} = await uploadArtifact(
fixtures.inputs.artifactName,
fixtures.files.map(file =>
path.join(fixtures.uploadDirectory, file.name)
),
fixtures.uploadDirectory
)
expect(id).toBe(1)
expect(size).toBe(loadedBytes)
expect(digest).toBeDefined()
expect(digest).toHaveLength(64)
const extractedDirectory = path.join(
fixtures.uploadDirectory,
'..',
'extracted'
)
if (fs.existsSync(extractedDirectory)) {
fs.rmdirSync(extractedDirectory, {recursive: true})
}
const extract = new Promise((resolve, reject) => {
fs.createReadStream(uploadedZip)
.pipe(unzip.Extract({path: extractedDirectory}))
.on('close', () => {
resolve(true)
})
.on('error', err => {
reject(err)
})
})
await expect(extract).resolves.toBe(true)
for (const file of fixtures.files) {
const filePath = path.join(extractedDirectory, file.name)
expect(fs.existsSync(filePath)).toBe(true)
expect(fs.readFileSync(filePath, 'utf8')).toBe(file.content)
}
})
it('should throw an error when blob chunk uploads are delayed', async () => {
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'CreateArtifact')
.mockReturnValue(
Promise.resolve({
ok: true,
signedUploadUrl: 'https://signed-upload-url.local'
})
)
jest
.spyOn(ArtifactServiceClientJSON.prototype, 'FinalizeArtifact')
.mockReturnValue(
Promise.resolve({
ok: true,
artifactId: '1'
})
)
jest
.spyOn(config, 'getResultsServiceUrl')
.mockReturnValue('https://results.local')
jest.spyOn(config, 'getUploadChunkTimeout').mockReturnValue(2_000)
uploadStreamMock.mockImplementation(
async (
stream: NodeJS.ReadableStream,
bufferSize?: number,
maxConcurrency?: number,
options?: BlockBlobUploadStreamOptions
) => {
const {onProgress, abortSignal} = options || {}
onProgress?.({loadedBytes: 0})
return new Promise(resolve => {
abortSignal?.addEventListener('abort', () => {
resolve({})
})
})
}
)
const uploadResp = uploadArtifact(
fixtures.inputs.artifactName,
fixtures.inputs.files,
fixtures.inputs.rootDirectory
)
await expect(uploadResp).rejects.toThrow('Upload progress stalled.')
})
})

View File

@@ -1,11 +1,11 @@
import * as io from '../../io/src/io'
import * as path from 'path'
import {promises as fs} from 'fs'
import * as core from '@actions/core'
import {
getUploadZipSpecification,
validateRootDirectory
} from '../src/internal/upload/upload-zip-specification'
import {noopLogs} from './common'
const root = path.join(__dirname, '_temp', 'upload-specification')
const goodItem1Path = path.join(
@@ -51,11 +51,7 @@ const artifactFilesToUpload = [
describe('Search', () => {
beforeAll(async () => {
// mock all output so that there is less noise when running tests
jest.spyOn(console, 'log').mockImplementation(() => {})
jest.spyOn(core, 'debug').mockImplementation(() => {})
jest.spyOn(core, 'info').mockImplementation(() => {})
jest.spyOn(core, 'warning').mockImplementation(() => {})
noopLogs()
// clear temp directory
await io.rmRF(root)
@@ -309,4 +305,22 @@ describe('Search', () => {
}
}
})
it('Upload Specification - Includes symlinks', async () => {
const targetPath = path.join(root, 'link-dir', 'symlink-me.txt')
await fs.mkdir(path.dirname(targetPath), {recursive: true})
await fs.writeFile(targetPath, 'symlink file content')
const uploadPath = path.join(root, 'upload-dir', 'symlink.txt')
await fs.mkdir(path.dirname(uploadPath), {recursive: true})
await fs.symlink(targetPath, uploadPath, 'file')
const specifications = getUploadZipSpecification([uploadPath], root)
expect(specifications.length).toEqual(1)
expect(specifications[0].sourcePath).toEqual(uploadPath)
expect(specifications[0].destinationPath).toEqual(
path.join('/upload-dir', 'symlink.txt')
)
expect(specifications[0].stats.isSymbolicLink()).toBe(true)
})
})

View File

@@ -1,13 +1,14 @@
import * as config from '../src/internal/shared/config'
import * as util from '../src/internal/shared/util'
import {maskSigUrl, maskSecretUrls} from '../src/internal/shared/util'
import {setSecret, debug} from '@actions/core'
export const testRuntimeToken =
'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwic2NwIjoiQWN0aW9ucy5FeGFtcGxlIEFjdGlvbnMuQW5vdGhlckV4YW1wbGU6dGVzdCBBY3Rpb25zLlJlc3VsdHM6Y2U3ZjU0YzctNjFjNy00YWFlLTg4N2YtMzBkYTQ3NWY1ZjFhOmNhMzk1MDg1LTA0MGEtNTI2Yi0yY2U4LWJkYzg1ZjY5Mjc3NCIsImlhdCI6MTUxNjIzOTAyMn0.XYnI_wHPBlUi1mqYveJnnkJhp4dlFjqxzRmISPsqfw8'
describe('get-backend-ids-from-token', () => {
it('should return backend ids when the token is valid', () => {
jest
.spyOn(config, 'getRuntimeToken')
.mockReturnValue(
'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwic2NwIjoiQWN0aW9ucy5FeGFtcGxlIEFjdGlvbnMuQW5vdGhlckV4YW1wbGU6dGVzdCBBY3Rpb25zLlJlc3VsdHM6Y2U3ZjU0YzctNjFjNy00YWFlLTg4N2YtMzBkYTQ3NWY1ZjFhOmNhMzk1MDg1LTA0MGEtNTI2Yi0yY2U4LWJkYzg1ZjY5Mjc3NCIsImlhdCI6MTUxNjIzOTAyMn0.XYnI_wHPBlUi1mqYveJnnkJhp4dlFjqxzRmISPsqfw8'
)
jest.spyOn(config, 'getRuntimeToken').mockReturnValue(testRuntimeToken)
const backendIds = util.getBackendIdsFromToken()
expect(backendIds.workflowRunBackendId).toBe(
@@ -60,3 +61,159 @@ describe('get-backend-ids-from-token', () => {
)
})
})
jest.mock('@actions/core')
describe('maskSigUrl', () => {
beforeEach(() => {
jest.clearAllMocks()
})
it('does nothing if no sig parameter is present', () => {
const url = 'https://example.com'
maskSigUrl(url)
expect(setSecret).not.toHaveBeenCalled()
})
it('masks the sig parameter in the middle of the URL and sets it as a secret', () => {
const url = 'https://example.com/?param1=value1&sig=12345&param2=value2'
maskSigUrl(url)
expect(setSecret).toHaveBeenCalledWith('12345')
expect(setSecret).toHaveBeenCalledWith(encodeURIComponent('12345'))
})
it('does nothing if the URL is empty', () => {
const url = ''
maskSigUrl(url)
expect(setSecret).not.toHaveBeenCalled()
})
it('handles URLs with fragments', () => {
const url = 'https://example.com?sig=12345#fragment'
maskSigUrl(url)
expect(setSecret).toHaveBeenCalledWith('12345')
expect(setSecret).toHaveBeenCalledWith(encodeURIComponent('12345'))
})
})
describe('maskSigUrl handles special characters in signatures', () => {
beforeEach(() => {
jest.clearAllMocks()
})
it('handles signatures with slashes', () => {
const url = 'https://example.com/?sig=abc/123'
maskSigUrl(url)
expect(setSecret).toHaveBeenCalledWith('abc/123')
expect(setSecret).toHaveBeenCalledWith('abc%2F123')
})
it('handles signatures with plus signs', () => {
const url = 'https://example.com/?sig=abc+123'
maskSigUrl(url)
expect(setSecret).toHaveBeenCalledWith('abc 123')
expect(setSecret).toHaveBeenCalledWith('abc%20123')
})
it('handles signatures with equals signs', () => {
const url = 'https://example.com/?sig=abc=123'
maskSigUrl(url)
expect(setSecret).toHaveBeenCalledWith('abc=123')
expect(setSecret).toHaveBeenCalledWith('abc%3D123')
})
it('handles already percent-encoded signatures', () => {
const url = 'https://example.com/?sig=abc%2F123%3D'
maskSigUrl(url)
expect(setSecret).toHaveBeenCalledWith('abc/123=')
expect(setSecret).toHaveBeenCalledWith('abc%2F123%3D')
})
it('handles complex Azure SAS signatures', () => {
const url =
'https://example.com/container/file.txt?sig=nXyQIUj%2F%2F06Cxt80pBRYiiJlYqtPYg5sz%2FvEh5iHAhw%3D&se=2023-12-31'
maskSigUrl(url)
expect(setSecret).toHaveBeenCalledWith(
'nXyQIUj//06Cxt80pBRYiiJlYqtPYg5sz/vEh5iHAhw='
)
expect(setSecret).toHaveBeenCalledWith(
'nXyQIUj%2F%2F06Cxt80pBRYiiJlYqtPYg5sz%2FvEh5iHAhw%3D'
)
})
it('handles signatures with multiple special characters', () => {
const url = 'https://example.com/?sig=a/b+c=d&e=f'
maskSigUrl(url)
expect(setSecret).toHaveBeenCalledWith('a/b c=d')
expect(setSecret).toHaveBeenCalledWith('a%2Fb%20c%3Dd')
})
})
describe('maskSecretUrls', () => {
beforeEach(() => {
jest.clearAllMocks()
})
it('masks sig parameters in signed_upload_url and signed_url', () => {
const body = {
signed_upload_url: 'https://upload.com?sig=upload123',
signed_url: 'https://download.com?sig=download123'
}
maskSecretUrls(body)
expect(setSecret).toHaveBeenCalledWith('upload123')
expect(setSecret).toHaveBeenCalledWith(encodeURIComponent('upload123'))
expect(setSecret).toHaveBeenCalledWith('download123')
expect(setSecret).toHaveBeenCalledWith(encodeURIComponent('download123'))
})
it('handles case where only upload_url is present', () => {
const body = {
signed_upload_url: 'https://upload.com?sig=upload123'
}
maskSecretUrls(body)
expect(setSecret).toHaveBeenCalledWith('upload123')
expect(setSecret).toHaveBeenCalledWith(encodeURIComponent('upload123'))
})
it('handles case where only download_url is present', () => {
const body = {
signed_url: 'https://download.com?sig=download123'
}
maskSecretUrls(body)
expect(setSecret).toHaveBeenCalledWith('download123')
expect(setSecret).toHaveBeenCalledWith(encodeURIComponent('download123'))
})
it('handles case where URLs do not contain sig parameters', () => {
const body = {
signed_upload_url: 'https://upload.com?token=abc',
signed_url: 'https://download.com?token=xyz'
}
maskSecretUrls(body)
expect(setSecret).not.toHaveBeenCalled()
})
it('handles empty string URLs', () => {
const body = {
signed_upload_url: '',
signed_url: ''
}
maskSecretUrls(body)
expect(setSecret).not.toHaveBeenCalled()
})
it('does nothing if body is not an object or is null', () => {
maskSecretUrls(null)
expect(debug).toHaveBeenCalledWith('body is not an object or is null')
expect(setSecret).not.toHaveBeenCalled()
})
it('does nothing if signed_upload_url and signed_url are not strings', () => {
const body = {
signed_upload_url: 123,
signed_url: 456
}
maskSecretUrls(body)
expect(setSecret).not.toHaveBeenCalled()
})
})

View File

@@ -1 +0,0 @@
Docs will be added here once development of version `2.0.0` has finished

View File

@@ -0,0 +1,62 @@
# Frequently Asked Questions
- [Frequently Asked Questions](#frequently-asked-questions)
- [Supported Characters](#supported-characters)
- [Compression? ZIP? How is my artifact stored?](#compression-zip-how-is-my-artifact-stored)
- [Which versions of the artifacts packages are compatible?](#which-versions-of-the-artifacts-packages-are-compatible)
- [How long will my artifact be available?](#how-long-will-my-artifact-be-available)
## Supported Characters
When uploading an artifact, the provided `name` parameter and the files specified in `files` cannot contain any of the following characters. If any are present in `name` or `files`, the artifact will be rejected by the server and the upload will fail. These characters are not allowed due to limitations of certain file systems such as NTFS. To keep behavior platform-agnostic, a character that is unsupported on any one filesystem/platform is not supported on any filesystem/platform.
- "
- :
- <
- \>
- |
- \*
- ?
In addition to the characters above, the provided `name` also cannot include the following (see the sketch after this list):
- \
- /
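As an illustration, a minimal pre-flight check over these restrictions might look like the sketch below. The helper name is hypothetical; the package performs its own validation during upload.
```ts
// Hypothetical client-side check mirroring the documented restrictions.
// The authoritative validation happens in the artifact package and server-side.
const invalidArtifactNameCharacters = ['"', ':', '<', '>', '|', '*', '?', '\\', '/']

function assertValidArtifactName(name: string): void {
  for (const char of invalidArtifactNameCharacters) {
    if (name.includes(char)) {
      throw new Error(`Artifact name contains the invalid character "${char}"`)
    }
  }
}

assertValidArtifactName('my-artifact') // ok
// assertValidArtifactName('bad:name') // would throw
```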
## Compression? ZIP? How is my artifact stored?
When creating an artifact, the files are dynamically compressed and streamed into a ZIP archive. Because they are stored in a ZIP, they can be compressed by zlib at varying levels.
The compression level can range from 0 to 9:
- 0: No compression
- 1: Best speed
- 6: Default compression (same as GNU Gzip)
- 9: Best compression
Higher levels will result in better compression, but will take longer to complete.
For large files that are not easily compressed, a value of 0 is recommended for significantly faster uploads.
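For example, the level can be chosen per upload through the options argument. A minimal sketch, assuming the `compressionLevel` field of `UploadArtifactOptions`:
```ts
import {DefaultArtifactClient} from '@actions/artifact'

const artifact = new DefaultArtifactClient()

// Level 0 skips compression entirely, which is fastest for files
// that are already compressed (videos, archives, images).
await artifact.uploadArtifact(
  'my-large-artifact',
  ['media/recording.mp4'],
  'media',
  {compressionLevel: 0}
)
```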
## Which versions of the artifacts packages are compatible?
[actions/upload-artifact](https://github.com/actions/upload-artifact) and [actions/download-artifact](https://github.com/actions/download-artifact) leverage the [GitHub Actions toolkit](https://github.com/actions/toolkit) and are typically used together to upload and download artifacts in your workflows.
| upload-artifact | download-artifact | toolkit |
|---|---|---|
| v4 | v4 | v2 |
| < v3 | < v3 | < v1 |
Use matching versions of `actions/upload-artifact` and `actions/download-artifact` to ensure compatibility.
In your GitHub Actions workflow YAML file, you specify the version of the actions you want to use. For example:
```yaml
uses: actions/upload-artifact@v4
# ...
uses: actions/download-artifact@v4
# ...
```
**Release Notes:**
Check the release notes for each repository to see if there are any specific notes about compatibility or changes in behavior.
## How long will my artifact be available?
The default retention period is **90 days**. For more information, visit: https://github.com/actions/upload-artifact?tab=readme-ov-file#retention-period

View File

@@ -0,0 +1,43 @@
@actions/artifact
# @actions/artifact
## Table of contents
### Classes
- [ArtifactNotFoundError](classes/ArtifactNotFoundError.md)
- [DefaultArtifactClient](classes/DefaultArtifactClient.md)
- [FilesNotFoundError](classes/FilesNotFoundError.md)
- [GHESNotSupportedError](classes/GHESNotSupportedError.md)
- [InvalidResponseError](classes/InvalidResponseError.md)
- [NetworkError](classes/NetworkError.md)
- [UsageError](classes/UsageError.md)
### Interfaces
- [Artifact](interfaces/Artifact.md)
- [ArtifactClient](interfaces/ArtifactClient.md)
- [DeleteArtifactResponse](interfaces/DeleteArtifactResponse.md)
- [DownloadArtifactOptions](interfaces/DownloadArtifactOptions.md)
- [DownloadArtifactResponse](interfaces/DownloadArtifactResponse.md)
- [FindOptions](interfaces/FindOptions.md)
- [GetArtifactResponse](interfaces/GetArtifactResponse.md)
- [ListArtifactsOptions](interfaces/ListArtifactsOptions.md)
- [ListArtifactsResponse](interfaces/ListArtifactsResponse.md)
- [UploadArtifactOptions](interfaces/UploadArtifactOptions.md)
- [UploadArtifactResponse](interfaces/UploadArtifactResponse.md)
### Variables
- [default](README.md#default)
## Variables
### default
`Const` **default**: [`ArtifactClient`](interfaces/ArtifactClient.md)
#### Defined in
[src/artifact.ts:7](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/artifact.ts#L7)

View File

@@ -0,0 +1,169 @@
[@actions/artifact](../README.md) / ArtifactNotFoundError
# Class: ArtifactNotFoundError
## Hierarchy
- `Error`
**`ArtifactNotFoundError`**
## Table of contents
### Constructors
- [constructor](ArtifactNotFoundError.md#constructor)
### Properties
- [message](ArtifactNotFoundError.md#message)
- [name](ArtifactNotFoundError.md#name)
- [stack](ArtifactNotFoundError.md#stack)
- [prepareStackTrace](ArtifactNotFoundError.md#preparestacktrace)
- [stackTraceLimit](ArtifactNotFoundError.md#stacktracelimit)
### Methods
- [captureStackTrace](ArtifactNotFoundError.md#capturestacktrace)
## Constructors
### constructor
**new ArtifactNotFoundError**(`message?`): [`ArtifactNotFoundError`](ArtifactNotFoundError.md)
#### Parameters
| Name | Type | Default value |
| :------ | :------ | :------ |
| `message` | `string` | `'Artifact not found'` |
#### Returns
[`ArtifactNotFoundError`](ArtifactNotFoundError.md)
#### Overrides
Error.constructor
#### Defined in
[src/internal/shared/errors.ts:24](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L24)
## Properties
### message
**message**: `string`
#### Inherited from
Error.message
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1068
___
### name
**name**: `string`
#### Inherited from
Error.name
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1067
___
### stack
`Optional` **stack**: `string`
#### Inherited from
Error.stack
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1069
___
### prepareStackTrace
`Static` `Optional` **prepareStackTrace**: (`err`: `Error`, `stackTraces`: `CallSite`[]) => `any`
#### Type declaration
▸ (`err`, `stackTraces`): `any`
Optional override for formatting stack traces
##### Parameters
| Name | Type |
| :------ | :------ |
| `err` | `Error` |
| `stackTraces` | `CallSite`[] |
##### Returns
`any`
**`See`**
https://v8.dev/docs/stack-trace-api#customizing-stack-traces
#### Inherited from
Error.prepareStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:11
___
### stackTraceLimit
`Static` **stackTraceLimit**: `number`
#### Inherited from
Error.stackTraceLimit
#### Defined in
node_modules/@types/node/globals.d.ts:13
## Methods
### captureStackTrace
**captureStackTrace**(`targetObject`, `constructorOpt?`): `void`
Create .stack property on a target object
#### Parameters
| Name | Type |
| :------ | :------ |
| `targetObject` | `object` |
| `constructorOpt?` | `Function` |
#### Returns
`void`
#### Inherited from
Error.captureStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:4

View File

@@ -0,0 +1,193 @@
[@actions/artifact](../README.md) / DefaultArtifactClient
# Class: DefaultArtifactClient
The default artifact client that is used by the artifact action(s).
## Implements
- [`ArtifactClient`](../interfaces/ArtifactClient.md)
## Table of contents
### Constructors
- [constructor](DefaultArtifactClient.md#constructor)
### Methods
- [deleteArtifact](DefaultArtifactClient.md#deleteartifact)
- [downloadArtifact](DefaultArtifactClient.md#downloadartifact)
- [getArtifact](DefaultArtifactClient.md#getartifact)
- [listArtifacts](DefaultArtifactClient.md#listartifacts)
- [uploadArtifact](DefaultArtifactClient.md#uploadartifact)
## Constructors
### constructor
**new DefaultArtifactClient**(): [`DefaultArtifactClient`](DefaultArtifactClient.md)
#### Returns
[`DefaultArtifactClient`](DefaultArtifactClient.md)
## Methods
### deleteArtifact
**deleteArtifact**(`artifactName`, `options?`): `Promise`\<[`DeleteArtifactResponse`](../interfaces/DeleteArtifactResponse.md)\>
Delete an Artifact
If `options.findBy` is specified, this will use the public Delete Artifact API https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#delete-an-artifact
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `artifactName` | `string` | The name of the artifact to delete |
| `options?` | [`FindOptions`](../interfaces/FindOptions.md) | Extra options that allow for the customization of the delete behavior |
#### Returns
`Promise`\<[`DeleteArtifactResponse`](../interfaces/DeleteArtifactResponse.md)\>
single DeleteArtifactResponse object
#### Implementation of
[ArtifactClient](../interfaces/ArtifactClient.md).[deleteArtifact](../interfaces/ArtifactClient.md#deleteartifact)
#### Defined in
[src/internal/client.ts:248](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L248)
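A minimal usage sketch (the artifact name is illustrative, and the `id` field is assumed from `DeleteArtifactResponse`):
```ts
import {DefaultArtifactClient} from '@actions/artifact'

const client = new DefaultArtifactClient()

// Delete an artifact by name within the current workflow run
const {id} = await client.deleteArtifact('stale-test-results')
console.log(`Deleted artifact ${id}`)
```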
___
### downloadArtifact
**downloadArtifact**(`artifactId`, `options?`): `Promise`\<[`DownloadArtifactResponse`](../interfaces/DownloadArtifactResponse.md)\>
Downloads an artifact and unzips the content.
If `options.findBy` is specified, this will use the public Download Artifact API https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#download-an-artifact
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `artifactId` | `number` | The id of the artifact to download |
| `options?` | [`DownloadArtifactOptions`](../interfaces/DownloadArtifactOptions.md) & [`FindOptions`](../interfaces/FindOptions.md) | Extra options that allow for the customization of the download behavior |
#### Returns
`Promise`\<[`DownloadArtifactResponse`](../interfaces/DownloadArtifactResponse.md)\>
single DownloadArtifactResponse object
#### Implementation of
[ArtifactClient](../interfaces/ArtifactClient.md).[downloadArtifact](../interfaces/ArtifactClient.md#downloadartifact)
#### Defined in
[src/internal/client.ts:138](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L138)
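A minimal usage sketch, assuming the `path` option of `DownloadArtifactOptions` and the `downloadPath` field of the response:
```ts
import {DefaultArtifactClient} from '@actions/artifact'

const client = new DefaultArtifactClient()

// Download artifact 1234 from the current run and unzip it into ./out
const {downloadPath} = await client.downloadArtifact(1234, {path: 'out'})
console.log(`Artifact extracted to ${downloadPath}`)
```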
___
### getArtifact
**getArtifact**(`artifactName`, `options?`): `Promise`\<[`GetArtifactResponse`](../interfaces/GetArtifactResponse.md)\>
Finds an artifact by name.
If there are multiple artifacts with the same name in the same workflow run, this will return the latest.
If the artifact is not found, it will throw.
If `options.findBy` is specified, this will use the public List Artifacts API with a name filter which can get artifacts from other runs.
https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
`@actions/artifact` v2+ does not allow for creating multiple artifacts with the same name in the same workflow run.
Multiple artifacts with the same name can exist in the same workflow run when older versions of upload-artifact (v1, v2, and v3) or `@actions/artifact` < v2 were used, or when the run is a rerun.
If there are multiple artifacts with the same name in the same workflow run, this function will return the first artifact that matches the name.
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `artifactName` | `string` | The name of the artifact to find |
| `options?` | [`FindOptions`](../interfaces/FindOptions.md) | Extra options that allow for the customization of the get behavior |
#### Returns
`Promise`\<[`GetArtifactResponse`](../interfaces/GetArtifactResponse.md)\>
#### Implementation of
[ArtifactClient](../interfaces/ArtifactClient.md).[getArtifact](../interfaces/ArtifactClient.md#getartifact)
#### Defined in
[src/internal/client.ts:212](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L212)
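A minimal usage sketch; since a missing artifact throws, the lookup is wrapped in a try/catch (catching `ArtifactNotFoundError` from this package):
```ts
import {DefaultArtifactClient, ArtifactNotFoundError} from '@actions/artifact'

const client = new DefaultArtifactClient()

try {
  const {artifact} = await client.getArtifact('my-artifact')
  console.log(`Found ${artifact.name} (#${artifact.id}, ${artifact.size} bytes)`)
} catch (err) {
  if (err instanceof ArtifactNotFoundError) {
    console.log('No artifact with that name in this run')
  } else {
    throw err
  }
}
```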
___
### listArtifacts
**listArtifacts**(`options?`): `Promise`\<[`ListArtifactsResponse`](../interfaces/ListArtifactsResponse.md)\>
Lists all artifacts that are part of the current workflow run.
This function will return at most 1000 artifacts per workflow run by default; newer versions make this limit configurable via the `ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT` environment variable.
If `options.findBy` is specified, this will call the public List-Artifacts API which can list from other runs.
https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `options?` | [`ListArtifactsOptions`](../interfaces/ListArtifactsOptions.md) & [`FindOptions`](../interfaces/FindOptions.md) | Extra options that allow for the customization of the list behavior |
#### Returns
`Promise`\<[`ListArtifactsResponse`](../interfaces/ListArtifactsResponse.md)\>
ListArtifactResponse object
#### Implementation of
[ArtifactClient](../interfaces/ArtifactClient.md).[listArtifacts](../interfaces/ArtifactClient.md#listartifacts)
#### Defined in
[src/internal/client.ts:176](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L176)
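A minimal usage sketch, assuming the `latest` field of `ListArtifactsOptions` for deduplicating by name:
```ts
import {DefaultArtifactClient} from '@actions/artifact'

const client = new DefaultArtifactClient()

// List artifacts in the current run, keeping only the latest entry per name
const {artifacts} = await client.listArtifacts({latest: true})
for (const artifact of artifacts) {
  console.log(`${artifact.name} (#${artifact.id}, ${artifact.size} bytes)`)
}
```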
___
### uploadArtifact
**uploadArtifact**(`name`, `files`, `rootDirectory`, `options?`): `Promise`\<[`UploadArtifactResponse`](../interfaces/UploadArtifactResponse.md)\>
Uploads an artifact.
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `name` | `string` | The name of the artifact, required |
| `files` | `string`[] | A list of absolute or relative paths that denote what files should be uploaded |
| `rootDirectory` | `string` | An absolute or relative file path that denotes the root parent directory of the files being uploaded |
| `options?` | [`UploadArtifactOptions`](../interfaces/UploadArtifactOptions.md) | Extra options for customizing the upload behavior |
#### Returns
`Promise`\<[`UploadArtifactResponse`](../interfaces/UploadArtifactResponse.md)\>
single UploadArtifactResponse object
#### Implementation of
[ArtifactClient](../interfaces/ArtifactClient.md).[uploadArtifact](../interfaces/ArtifactClient.md#uploadartifact)
#### Defined in
[src/internal/client.ts:113](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L113)
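A minimal usage sketch (paths are illustrative; `retentionDays` is assumed from `UploadArtifactOptions`):
```ts
import {DefaultArtifactClient} from '@actions/artifact'

const client = new DefaultArtifactClient()

const {id, size} = await client.uploadArtifact(
  'test-results',
  ['reports/junit.xml', 'reports/coverage.txt'],
  'reports',
  {retentionDays: 7}
)
console.log(`Uploaded artifact ${id} (${size} bytes)`)
```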

View File

@@ -0,0 +1,180 @@
[@actions/artifact](../README.md) / FilesNotFoundError
# Class: FilesNotFoundError
## Hierarchy
- `Error`
**`FilesNotFoundError`**
## Table of contents
### Constructors
- [constructor](FilesNotFoundError.md#constructor)
### Properties
- [files](FilesNotFoundError.md#files)
- [message](FilesNotFoundError.md#message)
- [name](FilesNotFoundError.md#name)
- [stack](FilesNotFoundError.md#stack)
- [prepareStackTrace](FilesNotFoundError.md#preparestacktrace)
- [stackTraceLimit](FilesNotFoundError.md#stacktracelimit)
### Methods
- [captureStackTrace](FilesNotFoundError.md#capturestacktrace)
## Constructors
### constructor
**new FilesNotFoundError**(`files?`): [`FilesNotFoundError`](FilesNotFoundError.md)
#### Parameters
| Name | Type | Default value |
| :------ | :------ | :------ |
| `files` | `string`[] | `[]` |
#### Returns
[`FilesNotFoundError`](FilesNotFoundError.md)
#### Overrides
Error.constructor
#### Defined in
[src/internal/shared/errors.ts:4](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L4)
## Properties
### files
**files**: `string`[]
#### Defined in
[src/internal/shared/errors.ts:2](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L2)
___
### message
**message**: `string`
#### Inherited from
Error.message
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1068
___
### name
**name**: `string`
#### Inherited from
Error.name
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1067
___
### stack
`Optional` **stack**: `string`
#### Inherited from
Error.stack
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1069
___
### prepareStackTrace
`Static` `Optional` **prepareStackTrace**: (`err`: `Error`, `stackTraces`: `CallSite`[]) => `any`
#### Type declaration
▸ (`err`, `stackTraces`): `any`
Optional override for formatting stack traces
##### Parameters
| Name | Type |
| :------ | :------ |
| `err` | `Error` |
| `stackTraces` | `CallSite`[] |
##### Returns
`any`
**`See`**
https://v8.dev/docs/stack-trace-api#customizing-stack-traces
#### Inherited from
Error.prepareStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:11
___
### stackTraceLimit
`Static` **stackTraceLimit**: `number`
#### Inherited from
Error.stackTraceLimit
#### Defined in
node_modules/@types/node/globals.d.ts:13
## Methods
### captureStackTrace
**captureStackTrace**(`targetObject`, `constructorOpt?`): `void`
Create .stack property on a target object
#### Parameters
| Name | Type |
| :------ | :------ |
| `targetObject` | `object` |
| `constructorOpt?` | `Function` |
#### Returns
`void`
#### Inherited from
Error.captureStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:4

View File

@@ -0,0 +1,169 @@
[@actions/artifact](../README.md) / GHESNotSupportedError
# Class: GHESNotSupportedError
## Hierarchy
- `Error`
**`GHESNotSupportedError`**
## Table of contents
### Constructors
- [constructor](GHESNotSupportedError.md#constructor)
### Properties
- [message](GHESNotSupportedError.md#message)
- [name](GHESNotSupportedError.md#name)
- [stack](GHESNotSupportedError.md#stack)
- [prepareStackTrace](GHESNotSupportedError.md#preparestacktrace)
- [stackTraceLimit](GHESNotSupportedError.md#stacktracelimit)
### Methods
- [captureStackTrace](GHESNotSupportedError.md#capturestacktrace)
## Constructors
### constructor
**new GHESNotSupportedError**(`message?`): [`GHESNotSupportedError`](GHESNotSupportedError.md)
#### Parameters
| Name | Type | Default value |
| :------ | :------ | :------ |
| `message` | `string` | `'@actions/artifact v2.0.0+, upload-artifact@v4+ and download-artifact@v4+ are not currently supported on GHES.'` |
#### Returns
[`GHESNotSupportedError`](GHESNotSupportedError.md)
#### Overrides
Error.constructor
#### Defined in
[src/internal/shared/errors.ts:31](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L31)
## Properties
### message
**message**: `string`
#### Inherited from
Error.message
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1068
___
### name
**name**: `string`
#### Inherited from
Error.name
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1067
___
### stack
`Optional` **stack**: `string`
#### Inherited from
Error.stack
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1069
___
### prepareStackTrace
`Static` `Optional` **prepareStackTrace**: (`err`: `Error`, `stackTraces`: `CallSite`[]) => `any`
#### Type declaration
▸ (`err`, `stackTraces`): `any`
Optional override for formatting stack traces
##### Parameters
| Name | Type |
| :------ | :------ |
| `err` | `Error` |
| `stackTraces` | `CallSite`[] |
##### Returns
`any`
**`See`**
https://v8.dev/docs/stack-trace-api#customizing-stack-traces
#### Inherited from
Error.prepareStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:11
___
### stackTraceLimit
`Static` **stackTraceLimit**: `number`
#### Inherited from
Error.stackTraceLimit
#### Defined in
node_modules/@types/node/globals.d.ts:13
## Methods
### captureStackTrace
**captureStackTrace**(`targetObject`, `constructorOpt?`): `void`
Create .stack property on a target object
#### Parameters
| Name | Type |
| :------ | :------ |
| `targetObject` | `object` |
| `constructorOpt?` | `Function` |
#### Returns
`void`
#### Inherited from
Error.captureStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:4

View File

@@ -0,0 +1,169 @@
[@actions/artifact](../README.md) / InvalidResponseError
# Class: InvalidResponseError
## Hierarchy
- `Error`
**`InvalidResponseError`**
## Table of contents
### Constructors
- [constructor](InvalidResponseError.md#constructor)
### Properties
- [message](InvalidResponseError.md#message)
- [name](InvalidResponseError.md#name)
- [stack](InvalidResponseError.md#stack)
- [prepareStackTrace](InvalidResponseError.md#preparestacktrace)
- [stackTraceLimit](InvalidResponseError.md#stacktracelimit)
### Methods
- [captureStackTrace](InvalidResponseError.md#capturestacktrace)
## Constructors
### constructor
**new InvalidResponseError**(`message`): [`InvalidResponseError`](InvalidResponseError.md)
#### Parameters
| Name | Type |
| :------ | :------ |
| `message` | `string` |
#### Returns
[`InvalidResponseError`](InvalidResponseError.md)
#### Overrides
Error.constructor
#### Defined in
[src/internal/shared/errors.ts:17](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L17)
## Properties
### message
**message**: `string`
#### Inherited from
Error.message
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1068
___
### name
**name**: `string`
#### Inherited from
Error.name
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1067
___
### stack
`Optional` **stack**: `string`
#### Inherited from
Error.stack
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1069
___
### prepareStackTrace
`Static` `Optional` **prepareStackTrace**: (`err`: `Error`, `stackTraces`: `CallSite`[]) => `any`
#### Type declaration
▸ (`err`, `stackTraces`): `any`
Optional override for formatting stack traces
##### Parameters
| Name | Type |
| :------ | :------ |
| `err` | `Error` |
| `stackTraces` | `CallSite`[] |
##### Returns
`any`
**`See`**
https://v8.dev/docs/stack-trace-api#customizing-stack-traces
#### Inherited from
Error.prepareStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:11
___
### stackTraceLimit
`Static` **stackTraceLimit**: `number`
#### Inherited from
Error.stackTraceLimit
#### Defined in
node_modules/@types/node/globals.d.ts:13
## Methods
### captureStackTrace
**captureStackTrace**(`targetObject`, `constructorOpt?`): `void`
Create .stack property on a target object
#### Parameters
| Name | Type |
| :------ | :------ |
| `targetObject` | `object` |
| `constructorOpt?` | `Function` |
#### Returns
`void`
#### Inherited from
Error.captureStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:4

View File

@@ -0,0 +1,201 @@
[@actions/artifact](../README.md) / NetworkError
# Class: NetworkError
## Hierarchy
- `Error`
**`NetworkError`**
## Table of contents
### Constructors
- [constructor](NetworkError.md#constructor)
### Properties
- [code](NetworkError.md#code)
- [message](NetworkError.md#message)
- [name](NetworkError.md#name)
- [stack](NetworkError.md#stack)
- [prepareStackTrace](NetworkError.md#preparestacktrace)
- [stackTraceLimit](NetworkError.md#stacktracelimit)
### Methods
- [captureStackTrace](NetworkError.md#capturestacktrace)
- [isNetworkErrorCode](NetworkError.md#isnetworkerrorcode)
## Constructors
### constructor
**new NetworkError**(`code`): [`NetworkError`](NetworkError.md)
#### Parameters
| Name | Type |
| :------ | :------ |
| `code` | `string` |
#### Returns
[`NetworkError`](NetworkError.md)
#### Overrides
Error.constructor
#### Defined in
[src/internal/shared/errors.ts:42](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L42)
## Properties
### code
**code**: `string`
#### Defined in
[src/internal/shared/errors.ts:40](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L40)
___
### message
**message**: `string`
#### Inherited from
Error.message
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1068
___
### name
**name**: `string`
#### Inherited from
Error.name
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1067
___
### stack
`Optional` **stack**: `string`
#### Inherited from
Error.stack
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1069
___
### prepareStackTrace
`Static` `Optional` **prepareStackTrace**: (`err`: `Error`, `stackTraces`: `CallSite`[]) => `any`
#### Type declaration
▸ (`err`, `stackTraces`): `any`
Optional override for formatting stack traces
##### Parameters
| Name | Type |
| :------ | :------ |
| `err` | `Error` |
| `stackTraces` | `CallSite`[] |
##### Returns
`any`
**`See`**
https://v8.dev/docs/stack-trace-api#customizing-stack-traces
#### Inherited from
Error.prepareStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:11
___
### stackTraceLimit
`Static` **stackTraceLimit**: `number`
#### Inherited from
Error.stackTraceLimit
#### Defined in
node_modules/@types/node/globals.d.ts:13
## Methods
### captureStackTrace
**captureStackTrace**(`targetObject`, `constructorOpt?`): `void`
Create .stack property on a target object
#### Parameters
| Name | Type |
| :------ | :------ |
| `targetObject` | `object` |
| `constructorOpt?` | `Function` |
#### Returns
`void`
#### Inherited from
Error.captureStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:4
___
### isNetworkErrorCode
**isNetworkErrorCode**(`code?`): `boolean`
#### Parameters
| Name | Type |
| :------ | :------ |
| `code?` | `string` |
#### Returns
`boolean`
#### Defined in
[src/internal/shared/errors.ts:49](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L49)

View File

@@ -0,0 +1,184 @@
[@actions/artifact](../README.md) / UsageError
# Class: UsageError
## Hierarchy
- `Error`
**`UsageError`**
## Table of contents
### Constructors
- [constructor](UsageError.md#constructor)
### Properties
- [message](UsageError.md#message)
- [name](UsageError.md#name)
- [stack](UsageError.md#stack)
- [prepareStackTrace](UsageError.md#preparestacktrace)
- [stackTraceLimit](UsageError.md#stacktracelimit)
### Methods
- [captureStackTrace](UsageError.md#capturestacktrace)
- [isUsageErrorMessage](UsageError.md#isusageerrormessage)
## Constructors
### constructor
**new UsageError**(): [`UsageError`](UsageError.md)
#### Returns
[`UsageError`](UsageError.md)
#### Overrides
Error.constructor
#### Defined in
[src/internal/shared/errors.ts:62](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L62)
## Properties
### message
**message**: `string`
#### Inherited from
Error.message
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1068
___
### name
**name**: `string`
#### Inherited from
Error.name
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1067
___
### stack
`Optional` **stack**: `string`
#### Inherited from
Error.stack
#### Defined in
node_modules/typescript/lib/lib.es5.d.ts:1069
___
### prepareStackTrace
`Static` `Optional` **prepareStackTrace**: (`err`: `Error`, `stackTraces`: `CallSite`[]) => `any`
#### Type declaration
▸ (`err`, `stackTraces`): `any`
Optional override for formatting stack traces
##### Parameters
| Name | Type |
| :------ | :------ |
| `err` | `Error` |
| `stackTraces` | `CallSite`[] |
##### Returns
`any`
**`See`**
https://v8.dev/docs/stack-trace-api#customizing-stack-traces
#### Inherited from
Error.prepareStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:11
___
### stackTraceLimit
`Static` **stackTraceLimit**: `number`
#### Inherited from
Error.stackTraceLimit
#### Defined in
node_modules/@types/node/globals.d.ts:13
## Methods
### captureStackTrace
**captureStackTrace**(`targetObject`, `constructorOpt?`): `void`
Create .stack property on a target object
#### Parameters
| Name | Type |
| :------ | :------ |
| `targetObject` | `object` |
| `constructorOpt?` | `Function` |
#### Returns
`void`
#### Inherited from
Error.captureStackTrace
#### Defined in
node_modules/@types/node/globals.d.ts:4
___
### isUsageErrorMessage
**isUsageErrorMessage**(`msg?`): `boolean`
#### Parameters
| Name | Type |
| :------ | :------ |
| `msg?` | `string` |
#### Returns
`boolean`
#### Defined in
[src/internal/shared/errors.ts:68](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/errors.ts#L68)

View File

@@ -0,0 +1,62 @@
[@actions/artifact](../README.md) / Artifact
# Interface: Artifact
An Actions Artifact
## Table of contents
### Properties
- [createdAt](Artifact.md#createdat)
- [id](Artifact.md#id)
- [name](Artifact.md#name)
- [size](Artifact.md#size)
## Properties
### createdAt
`Optional` **createdAt**: `Date`
The time when the artifact was created
#### Defined in
[src/internal/shared/interfaces.ts:128](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L128)
___
### id
**id**: `number`
The ID of the artifact
#### Defined in
[src/internal/shared/interfaces.ts:118](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L118)
___
### name
**name**: `string`
The name of the artifact
#### Defined in
[src/internal/shared/interfaces.ts:113](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L113)
___
### size
**size**: `number`
The size of the artifact in bytes
#### Defined in
[src/internal/shared/interfaces.ts:123](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L123)

View File

@@ -0,0 +1,159 @@
[@actions/artifact](../README.md) / ArtifactClient
# Interface: ArtifactClient
Generic interface for the artifact client.
## Implemented by
- [`DefaultArtifactClient`](../classes/DefaultArtifactClient.md)
## Table of contents
### Methods
- [deleteArtifact](ArtifactClient.md#deleteartifact)
- [downloadArtifact](ArtifactClient.md#downloadartifact)
- [getArtifact](ArtifactClient.md#getartifact)
- [listArtifacts](ArtifactClient.md#listartifacts)
- [uploadArtifact](ArtifactClient.md#uploadartifact)
## Methods
### deleteArtifact
**deleteArtifact**(`artifactName`, `options?`): `Promise`\<[`DeleteArtifactResponse`](DeleteArtifactResponse.md)\>
Delete an Artifact
If `options.findBy` is specified, this will use the public Delete Artifact API https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#delete-an-artifact
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `artifactName` | `string` | The name of the artifact to delete |
| `options?` | [`FindOptions`](FindOptions.md) | Extra options that allow for the customization of the delete behavior |
#### Returns
`Promise`\<[`DeleteArtifactResponse`](DeleteArtifactResponse.md)\>
single DeleteArtifactResponse object
#### Defined in
[src/internal/client.ts:103](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L103)
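A minimal usage sketch in an ES-module context (the artifact name is illustrative):

```typescript
import client from '@actions/artifact'

// Deletes the named artifact from the current workflow run;
// the response echoes the numeric id of the deleted artifact.
const {id} = await client.deleteArtifact('my-artifact')
console.log(`Deleted artifact ${id}`)
```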
___
### downloadArtifact
**downloadArtifact**(`artifactId`, `options?`): `Promise`\<[`DownloadArtifactResponse`](DownloadArtifactResponse.md)\>
Downloads an artifact and unzips the content.
If `options.findBy` is specified, this will use the public Download Artifact API https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#download-an-artifact
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `artifactId` | `number` | The id of the artifact to download |
| `options?` | [`DownloadArtifactOptions`](DownloadArtifactOptions.md) & [`FindOptions`](FindOptions.md) | Extra options that allow for the customization of the download behavior |
#### Returns
`Promise`\<[`DownloadArtifactResponse`](DownloadArtifactResponse.md)\>
single DownloadArtifactResponse object
#### Defined in
[src/internal/client.ts:89](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L89)
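A minimal usage sketch (the artifact id and target path are illustrative):

```typescript
import client from '@actions/artifact'

// Downloads and unzips artifact 1337 into /tmp/my-artifact;
// without `path`, content is extracted under GITHUB_WORKSPACE.
const {downloadPath} = await client.downloadArtifact(1337, {path: '/tmp/my-artifact'})
console.log(`Extracted to ${downloadPath}`)
```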
___
### getArtifact
**getArtifact**(`artifactName`, `options?`): `Promise`\<[`GetArtifactResponse`](GetArtifactResponse.md)\>
Finds an artifact by name.
If there are multiple artifacts with the same name in the same workflow run, this will return the latest.
If the artifact is not found, it will throw.
If `options.findBy` is specified, this will use the public List Artifacts API with a name filter which can get artifacts from other runs.
https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
`@actions/artifact` v2+ does not allow creating multiple artifacts with the same name in the same workflow run.
Duplicates can still occur if the run used old versions of upload-artifact (v1, v2, and v3), used `@actions/artifact` < v2, or is a rerun.
In that case, this function returns the latest artifact that matches the name.
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `artifactName` | `string` | The name of the artifact to find |
| `options?` | [`FindOptions`](FindOptions.md) | Extra options that allow for the customization of the get behavior |
#### Returns
`Promise`\<[`GetArtifactResponse`](GetArtifactResponse.md)\>
#### Defined in
[src/internal/client.ts:75](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L75)
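A minimal usage sketch (the artifact name is illustrative):

```typescript
import client from '@actions/artifact'

// Finds 'test-results' in the current workflow run; throws if it does not exist.
const {artifact} = await client.getArtifact('test-results')
console.log(`${artifact.name}: id ${artifact.id}, ${artifact.size} bytes`)
```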
___
### listArtifacts
**listArtifacts**(`options?`): `Promise`\<[`ListArtifactsResponse`](ListArtifactsResponse.md)\>
Lists all artifacts that are part of the current workflow run.
This function will return at most 1000 artifacts per workflow run.
If `options.findBy` is specified, this will call the public List-Artifacts API which can list from other runs.
https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `options?` | [`ListArtifactsOptions`](ListArtifactsOptions.md) & [`FindOptions`](FindOptions.md) | Extra options that allow for the customization of the list behavior |
#### Returns
`Promise`\<[`ListArtifactsResponse`](ListArtifactsResponse.md)\>
ListArtifactsResponse object
#### Defined in
[src/internal/client.ts:57](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L57)
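A minimal usage sketch; `latest: true` is one way to de-duplicate rerun artifacts by name:

```typescript
import client from '@actions/artifact'

const {artifacts} = await client.listArtifacts({latest: true})
for (const artifact of artifacts) {
  console.log(`${artifact.name} (id ${artifact.id}): ${artifact.size} bytes`)
}
```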
___
### uploadArtifact
**uploadArtifact**(`name`, `files`, `rootDirectory`, `options?`): `Promise`\<[`UploadArtifactResponse`](UploadArtifactResponse.md)\>
Uploads an artifact.
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `name` | `string` | The name of the artifact, required |
| `files` | `string`[] | A list of absolute or relative paths that denote what files should be uploaded |
| `rootDirectory` | `string` | An absolute or relative file path that denotes the root parent directory of the files being uploaded |
| `options?` | [`UploadArtifactOptions`](UploadArtifactOptions.md) | Extra options for customizing the upload behavior |
#### Returns
`Promise`\<[`UploadArtifactResponse`](UploadArtifactResponse.md)\>
single UploadArtifactResponse object
#### Defined in
[src/internal/client.ts:40](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/client.ts#L40)
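A minimal usage sketch (names and paths are illustrative); files are stored relative to `rootDirectory`:

```typescript
import client from '@actions/artifact'

const {id, size} = await client.uploadArtifact(
  'test-results',                          // artifact name
  ['results/unit.xml', 'results/e2e.xml'], // files to include
  'results'                                // root directory for relative paths
)
console.log(`Uploaded artifact ${id} (${size} bytes)`)
```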


@@ -0,0 +1,23 @@
[@actions/artifact](../README.md) / DeleteArtifactResponse
# Interface: DeleteArtifactResponse
Response from the server when deleting an artifact
## Table of contents
### Properties
- [id](DeleteArtifactResponse.md#id)
## Properties
### id
**id**: `number`
The id of the artifact that was deleted
#### Defined in
[src/internal/shared/interfaces.ts:163](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L163)


@@ -0,0 +1,23 @@
[@actions/artifact](../README.md) / DownloadArtifactOptions
# Interface: DownloadArtifactOptions
Options for downloading an artifact
## Table of contents
### Properties
- [path](DownloadArtifactOptions.md#path)
## Properties
### path
`Optional` **path**: `string`
Denotes where the artifact will be downloaded to. If not specified, the artifact is downloaded to GITHUB_WORKSPACE
#### Defined in
[src/internal/shared/interfaces.ts:103](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L103)


@@ -0,0 +1,23 @@
[@actions/artifact](../README.md) / DownloadArtifactResponse
# Interface: DownloadArtifactResponse
Response from the server when downloading an artifact
## Table of contents
### Properties
- [downloadPath](DownloadArtifactResponse.md#downloadpath)
## Properties
### downloadPath
`Optional` **downloadPath**: `string`
The path where the artifact was downloaded to
#### Defined in
[src/internal/shared/interfaces.ts:93](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L93)


@@ -0,0 +1,30 @@
[@actions/artifact](../README.md) / FindOptions
# Interface: FindOptions
## Table of contents
### Properties
- [findBy](FindOptions.md#findby)
## Properties
### findBy
`Optional` **findBy**: `Object`
The criteria for finding artifact(s) outside the scope of the current run.
#### Type declaration
| Name | Type | Description |
| :------ | :------ | :------ |
| `repositoryName` | `string` | Repository name (e.g. 'toolkit') |
| `repositoryOwner` | `string` | Repository owner (e.g. 'actions') |
| `token` | `string` | Token with actions:read permissions |
| `workflowRunId` | `number` | ID of the workflow run to look up artifact(s) in |
#### Defined in
[src/internal/shared/interfaces.ts:136](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L136)
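A hedged sketch of listing artifacts from a different workflow run via `findBy` (all values below are placeholders):

```typescript
import client from '@actions/artifact'

const {artifacts} = await client.listArtifacts({
  findBy: {
    token: process.env.GITHUB_TOKEN as string, // token with actions:read
    workflowRunId: 123456789,                  // run to look in
    repositoryOwner: 'actions',
    repositoryName: 'toolkit'
  }
})
```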


@@ -0,0 +1,23 @@
[@actions/artifact](../README.md) / GetArtifactResponse
# Interface: GetArtifactResponse
Response from the server when getting an artifact
## Table of contents
### Properties
- [artifact](GetArtifactResponse.md#artifact)
## Properties
### artifact
**artifact**: [`Artifact`](Artifact.md)
Metadata about the artifact that was found
#### Defined in
[src/internal/shared/interfaces.ts:62](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L62)


@@ -0,0 +1,24 @@
[@actions/artifact](../README.md) / ListArtifactsOptions
# Interface: ListArtifactsOptions
Options for listing artifacts
## Table of contents
### Properties
- [latest](ListArtifactsOptions.md#latest)
## Properties
### latest
`Optional` **latest**: `boolean`
Filter the workflow run's artifacts to the latest by name
In the case of reruns, this can be useful to avoid duplicates
#### Defined in
[src/internal/shared/interfaces.ts:73](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L73)


@@ -0,0 +1,23 @@
[@actions/artifact](../README.md) / ListArtifactsResponse
# Interface: ListArtifactsResponse
Response from the server when listing artifacts
## Table of contents
### Properties
- [artifacts](ListArtifactsResponse.md#artifacts)
## Properties
### artifacts
**artifacts**: [`Artifact`](Artifact.md)[]
A list of artifacts that were found
#### Defined in
[src/internal/shared/interfaces.ts:83](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L83)


@@ -0,0 +1,55 @@
[@actions/artifact](../README.md) / UploadArtifactOptions
# Interface: UploadArtifactOptions
Options for uploading an artifact
## Table of contents
### Properties
- [compressionLevel](UploadArtifactOptions.md#compressionlevel)
- [retentionDays](UploadArtifactOptions.md#retentiondays)
## Properties
### compressionLevel
`Optional` **compressionLevel**: `number`
The level of compression for Zlib to be applied to the artifact archive.
The value can range from 0 to 9:
- 0: No compression
- 1: Best speed
- 6: Default compression (same as GNU Gzip)
- 9: Best compression
Higher levels will result in better compression, but will take longer to complete.
For large files that are not easily compressed, a value of 0 is recommended for significantly faster uploads.
#### Defined in
[src/internal/shared/interfaces.ts:52](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L52)
___
### retentionDays
`Optional` **retentionDays**: `number`
Duration, in days, after which the artifact will expire.
By default artifacts expire after 90 days:
https://docs.github.com/en/actions/configuring-and-managing-workflows/persisting-workflow-data-using-artifacts#downloading-and-deleting-artifacts-after-a-workflow-run-is-complete
Use this option to override the default expiry.
Min value: 1
Max value: 90, unless changed by a repository setting
If this is set to a value greater than the retention settings allow, the retention period is reduced to the maximum allowed by the server and the upload continues. An input of 0 assumes the default retention setting.
#### Defined in
[src/internal/shared/interfaces.ts:41](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L41)
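A hedged sketch combining both options (paths are illustrative): an already-gzipped archive gains little from zlib, so compression is skipped and retention is shortened:

```typescript
import client from '@actions/artifact'

await client.uploadArtifact('dist', ['dist/app.tar.gz'], 'dist', {
  compressionLevel: 0, // skip zlib for an already-compressed tarball
  retentionDays: 7     // expire after a week instead of the 90-day default
})
```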


@@ -0,0 +1,50 @@
[@actions/artifact](../README.md) / UploadArtifactResponse
# Interface: UploadArtifactResponse
Response from the server when an artifact is uploaded
## Table of contents
### Properties
- [digest](UploadArtifactResponse.md#digest)
- [id](UploadArtifactResponse.md#id)
- [size](UploadArtifactResponse.md#size)
## Properties
### digest
`Optional` **digest**: `string`
The SHA256 digest of the artifact that was created. Not provided if no artifact was uploaded
#### Defined in
[src/internal/shared/interfaces.ts:19](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L19)
___
### id
`Optional` **id**: `number`
The id of the artifact that was created. Not provided if no artifact was uploaded
This ID can be used as input to other APIs to download, delete or get more information about an artifact: https://docs.github.com/en/rest/actions/artifacts
#### Defined in
[src/internal/shared/interfaces.ts:14](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L14)
___
### size
`Optional` **size**: `number`
Total size of the artifact in bytes. Not provided if no artifact was uploaded
#### Defined in
[src/internal/shared/interfaces.ts:8](https://github.com/actions/toolkit/blob/f522fdf/packages/artifact/src/internal/shared/interfaces.ts#L8)

File diff suppressed because it is too large


@@ -1,6 +1,6 @@
{
"name": "@actions/artifact",
"version": "2.0.0",
"version": "4.0.0",
"preview": true,
"description": "Actions artifact lib",
"keywords": [
@@ -30,33 +30,40 @@
},
"scripts": {
"audit-moderate": "npm install && npm audit --json --audit-level=moderate > audit.json",
"test": "echo \"Error: run tests from root\" && exit 1",
"test": "cd ../../ && npm run test ./packages/artifact",
"bootstrap": "cd ../../ && npm run bootstrap",
"tsc-run": "tsc",
"tsc": "npm run bootstrap && npm run tsc-run"
"tsc": "npm run bootstrap && npm run tsc-run",
"gen:docs": "typedoc --plugin typedoc-plugin-markdown --out docs/generated src/artifact.ts --githubPages false --readme none"
},
"bugs": {
"url": "https://github.com/actions/toolkit/issues"
},
"dependencies": {
"@actions/core": "^1.10.0",
"@actions/github": "^5.1.1",
"@actions/github": "^6.0.1",
"@actions/http-client": "^2.1.0",
"@azure/core-http": "^3.0.5",
"@azure/storage-blob": "^12.15.0",
"@octokit/core": "^3.5.1",
"@octokit/core": "^5.2.1",
"@octokit/plugin-request-log": "^1.0.4",
"@octokit/plugin-retry": "^3.0.9",
"@octokit/request-error": "^5.0.0",
"@octokit/request": "^8.4.1",
"@octokit/request-error": "^5.1.1",
"@protobuf-ts/plugin": "^2.2.3-alpha.1",
"@types/unzipper": "^0.10.6",
"archiver": "^5.3.1",
"crypto": "^1.0.1",
"archiver": "^7.0.1",
"jwt-decode": "^3.1.2",
"twirp-ts": "^2.5.0",
"unzipper": "^0.10.14"
"unzip-stream": "^0.3.1"
},
"devDependencies": {
"@types/archiver": "^5.3.2",
"typescript": "^4.3.0"
"@types/unzip-stream": "^0.3.4",
"typedoc": "^0.28.13",
"typedoc-plugin-markdown": "^3.17.1",
"typescript": "^5.2.2"
},
"overrides": {
"uri-js": "npm:uri-js-replace@^1.0.1",
"node-fetch": "^3.3.2"
}
}


@@ -1,11 +1,8 @@
import {ArtifactClient, Client} from './internal/client'
import {ArtifactClient, DefaultArtifactClient} from './internal/client'
/**
* Exported functionality that we want to expose for any users of @actions/artifact
*/
export * from './internal/shared/interfaces'
export {ArtifactClient}
export * from './internal/shared/errors'
export * from './internal/client'
export function create(): ArtifactClient {
return Client.create()
}
const client: ArtifactClient = new DefaultArtifactClient()
export default client


@@ -1,4 +1,4 @@
export * from './google/protobuf/timestamp'
export * from './google/protobuf/wrappers'
export * from './results/api/v1/artifact'
-export * from './results/api/v1/artifact.twirp'
+export * from './results/api/v1/artifact.twirp-client'


@@ -12,8 +12,69 @@ import type { PartialMessage } from "@protobuf-ts/runtime";
import { reflectionMergePartial } from "@protobuf-ts/runtime";
import { MESSAGE_TYPE } from "@protobuf-ts/runtime";
import { MessageType } from "@protobuf-ts/runtime";
import { Int64Value } from "../../../google/protobuf/wrappers";
import { StringValue } from "../../../google/protobuf/wrappers";
import { Timestamp } from "../../../google/protobuf/timestamp";
/**
* @generated from protobuf message github.actions.results.api.v1.MigrateArtifactRequest
*/
export interface MigrateArtifactRequest {
/**
* @generated from protobuf field: string workflow_run_backend_id = 1;
*/
workflowRunBackendId: string;
/**
* @generated from protobuf field: string name = 2;
*/
name: string;
/**
* @generated from protobuf field: google.protobuf.Timestamp expires_at = 3;
*/
expiresAt?: Timestamp;
}
/**
* @generated from protobuf message github.actions.results.api.v1.MigrateArtifactResponse
*/
export interface MigrateArtifactResponse {
/**
* @generated from protobuf field: bool ok = 1;
*/
ok: boolean;
/**
* @generated from protobuf field: string signed_upload_url = 2;
*/
signedUploadUrl: string;
}
/**
* @generated from protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactRequest
*/
export interface FinalizeMigratedArtifactRequest {
/**
* @generated from protobuf field: string workflow_run_backend_id = 1;
*/
workflowRunBackendId: string;
/**
* @generated from protobuf field: string name = 2;
*/
name: string;
/**
* @generated from protobuf field: int64 size = 3;
*/
size: string;
}
/**
* @generated from protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactResponse
*/
export interface FinalizeMigratedArtifactResponse {
/**
* @generated from protobuf field: bool ok = 1;
*/
ok: boolean;
/**
* @generated from protobuf field: int64 artifact_id = 2;
*/
artifactId: string;
}
/**
* @generated from protobuf message github.actions.results.api.v1.CreateArtifactRequest
*/
@@ -90,6 +151,377 @@ export interface FinalizeArtifactResponse
*/
artifactId: string;
}
/**
* @generated from protobuf message github.actions.results.api.v1.ListArtifactsRequest
*/
export interface ListArtifactsRequest {
/**
* The backend plan ID
*
* @generated from protobuf field: string workflow_run_backend_id = 1;
*/
workflowRunBackendId: string;
/**
* The backend job ID
*
* @generated from protobuf field: string workflow_job_run_backend_id = 2;
*/
workflowJobRunBackendId: string;
/**
* Name of the artifact to filter on
*
* @generated from protobuf field: google.protobuf.StringValue name_filter = 3;
*/
nameFilter?: StringValue; // optional
/**
* Monolith Database ID of the artifact to filter on
*
* @generated from protobuf field: google.protobuf.Int64Value id_filter = 4;
*/
idFilter?: Int64Value; // optional
}
/**
* @generated from protobuf message github.actions.results.api.v1.ListArtifactsResponse
*/
export interface ListArtifactsResponse {
/**
* @generated from protobuf field: repeated github.actions.results.api.v1.ListArtifactsResponse.MonolithArtifact artifacts = 1;
*/
artifacts: ListArtifactsResponse_MonolithArtifact[];
}
/**
* @generated from protobuf message github.actions.results.api.v1.ListArtifactsResponse.MonolithArtifact
*/
export interface ListArtifactsResponse_MonolithArtifact {
/**
* The backend plan ID
*
* @generated from protobuf field: string workflow_run_backend_id = 1;
*/
workflowRunBackendId: string;
/**
* The backend job ID
*
* @generated from protobuf field: string workflow_job_run_backend_id = 2;
*/
workflowJobRunBackendId: string;
/**
* Monolith database ID of the artifact
*
* @generated from protobuf field: int64 database_id = 3;
*/
databaseId: string;
/**
* Name of the artifact
*
* @generated from protobuf field: string name = 4;
*/
name: string;
/**
* Size of the artifact in bytes
*
* @generated from protobuf field: int64 size = 5;
*/
size: string;
/**
* When the artifact was created in the monolith
*
* @generated from protobuf field: google.protobuf.Timestamp created_at = 6;
*/
createdAt?: Timestamp;
/**
* The SHA-256 digest of the artifact, calculated on upload for upload-artifact v4 & newer
*
* @generated from protobuf field: google.protobuf.StringValue digest = 7;
*/
digest?: StringValue;
}
/**
* @generated from protobuf message github.actions.results.api.v1.GetSignedArtifactURLRequest
*/
export interface GetSignedArtifactURLRequest {
/**
* @generated from protobuf field: string workflow_run_backend_id = 1;
*/
workflowRunBackendId: string;
/**
* @generated from protobuf field: string workflow_job_run_backend_id = 2;
*/
workflowJobRunBackendId: string;
/**
* @generated from protobuf field: string name = 3;
*/
name: string;
}
/**
* @generated from protobuf message github.actions.results.api.v1.GetSignedArtifactURLResponse
*/
export interface GetSignedArtifactURLResponse {
/**
* @generated from protobuf field: string signed_url = 1;
*/
signedUrl: string;
}
/**
* @generated from protobuf message github.actions.results.api.v1.DeleteArtifactRequest
*/
export interface DeleteArtifactRequest {
/**
* @generated from protobuf field: string workflow_run_backend_id = 1;
*/
workflowRunBackendId: string;
/**
* @generated from protobuf field: string workflow_job_run_backend_id = 2;
*/
workflowJobRunBackendId: string;
/**
* @generated from protobuf field: string name = 3;
*/
name: string;
}
/**
* @generated from protobuf message github.actions.results.api.v1.DeleteArtifactResponse
*/
export interface DeleteArtifactResponse {
/**
* @generated from protobuf field: bool ok = 1;
*/
ok: boolean;
/**
* @generated from protobuf field: int64 artifact_id = 2;
*/
artifactId: string;
}
// @generated message type with reflection information, may provide speed optimized methods
class MigrateArtifactRequest$Type extends MessageType<MigrateArtifactRequest> {
constructor() {
super("github.actions.results.api.v1.MigrateArtifactRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "expires_at", kind: "message", T: () => Timestamp }
]);
}
create(value?: PartialMessage<MigrateArtifactRequest>): MigrateArtifactRequest {
const message = { workflowRunBackendId: "", name: "" };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<MigrateArtifactRequest>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: MigrateArtifactRequest): MigrateArtifactRequest {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string name */ 2:
message.name = reader.string();
break;
case /* google.protobuf.Timestamp expires_at */ 3:
message.expiresAt = Timestamp.internalBinaryRead(reader, reader.uint32(), options, message.expiresAt);
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: MigrateArtifactRequest, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string name = 2; */
if (message.name !== "")
writer.tag(2, WireType.LengthDelimited).string(message.name);
/* google.protobuf.Timestamp expires_at = 3; */
if (message.expiresAt)
Timestamp.internalBinaryWrite(message.expiresAt, writer.tag(3, WireType.LengthDelimited).fork(), options).join();
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactRequest
*/
export const MigrateArtifactRequest = new MigrateArtifactRequest$Type();
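// Illustrative note (not generated code): like every protobuf-ts MessageType,
// this constant can round-trip messages, e.g.
//   const bytes = MigrateArtifactRequest.toBinary(
//     MigrateArtifactRequest.create({workflowRunBackendId: "run-1", name: "my-artifact"})
//   );
//   const decoded = MigrateArtifactRequest.fromBinary(bytes);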
// @generated message type with reflection information, may provide speed optimized methods
class MigrateArtifactResponse$Type extends MessageType<MigrateArtifactResponse> {
constructor() {
super("github.actions.results.api.v1.MigrateArtifactResponse", [
{ no: 1, name: "ok", kind: "scalar", T: 8 /*ScalarType.BOOL*/ },
{ no: 2, name: "signed_upload_url", kind: "scalar", T: 9 /*ScalarType.STRING*/ }
]);
}
create(value?: PartialMessage<MigrateArtifactResponse>): MigrateArtifactResponse {
const message = { ok: false, signedUploadUrl: "" };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<MigrateArtifactResponse>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: MigrateArtifactResponse): MigrateArtifactResponse {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool ok */ 1:
message.ok = reader.bool();
break;
case /* string signed_upload_url */ 2:
message.signedUploadUrl = reader.string();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: MigrateArtifactResponse, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* bool ok = 1; */
if (message.ok !== false)
writer.tag(1, WireType.Varint).bool(message.ok);
/* string signed_upload_url = 2; */
if (message.signedUploadUrl !== "")
writer.tag(2, WireType.LengthDelimited).string(message.signedUploadUrl);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactResponse
*/
export const MigrateArtifactResponse = new MigrateArtifactResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class FinalizeMigratedArtifactRequest$Type extends MessageType<FinalizeMigratedArtifactRequest> {
constructor() {
super("github.actions.results.api.v1.FinalizeMigratedArtifactRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "size", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
create(value?: PartialMessage<FinalizeMigratedArtifactRequest>): FinalizeMigratedArtifactRequest {
const message = { workflowRunBackendId: "", name: "", size: "0" };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<FinalizeMigratedArtifactRequest>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: FinalizeMigratedArtifactRequest): FinalizeMigratedArtifactRequest {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string name */ 2:
message.name = reader.string();
break;
case /* int64 size */ 3:
message.size = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: FinalizeMigratedArtifactRequest, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string name = 2; */
if (message.name !== "")
writer.tag(2, WireType.LengthDelimited).string(message.name);
/* int64 size = 3; */
if (message.size !== "0")
writer.tag(3, WireType.Varint).int64(message.size);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactRequest
*/
export const FinalizeMigratedArtifactRequest = new FinalizeMigratedArtifactRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class FinalizeMigratedArtifactResponse$Type extends MessageType<FinalizeMigratedArtifactResponse> {
constructor() {
super("github.actions.results.api.v1.FinalizeMigratedArtifactResponse", [
{ no: 1, name: "ok", kind: "scalar", T: 8 /*ScalarType.BOOL*/ },
{ no: 2, name: "artifact_id", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
create(value?: PartialMessage<FinalizeMigratedArtifactResponse>): FinalizeMigratedArtifactResponse {
const message = { ok: false, artifactId: "0" };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<FinalizeMigratedArtifactResponse>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: FinalizeMigratedArtifactResponse): FinalizeMigratedArtifactResponse {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool ok */ 1:
message.ok = reader.bool();
break;
case /* int64 artifact_id */ 2:
message.artifactId = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: FinalizeMigratedArtifactResponse, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* bool ok = 1; */
if (message.ok !== false)
writer.tag(1, WireType.Varint).bool(message.ok);
/* int64 artifact_id = 2; */
if (message.artifactId !== "0")
writer.tag(2, WireType.Varint).int64(message.artifactId);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactResponse
*/
export const FinalizeMigratedArtifactResponse = new FinalizeMigratedArtifactResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class CreateArtifactRequest$Type extends MessageType<CreateArtifactRequest> {
constructor() {
@@ -348,10 +780,442 @@ class FinalizeArtifactResponse$Type extends MessageType<FinalizeArtifactResponse
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeArtifactResponse
*/
export const FinalizeArtifactResponse = new FinalizeArtifactResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class ListArtifactsRequest$Type extends MessageType<ListArtifactsRequest> {
constructor() {
super("github.actions.results.api.v1.ListArtifactsRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "workflow_job_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "name_filter", kind: "message", T: () => StringValue },
{ no: 4, name: "id_filter", kind: "message", T: () => Int64Value }
]);
}
create(value?: PartialMessage<ListArtifactsRequest>): ListArtifactsRequest {
const message = { workflowRunBackendId: "", workflowJobRunBackendId: "" };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<ListArtifactsRequest>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: ListArtifactsRequest): ListArtifactsRequest {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string workflow_job_run_backend_id */ 2:
message.workflowJobRunBackendId = reader.string();
break;
case /* google.protobuf.StringValue name_filter */ 3:
message.nameFilter = StringValue.internalBinaryRead(reader, reader.uint32(), options, message.nameFilter);
break;
case /* google.protobuf.Int64Value id_filter */ 4:
message.idFilter = Int64Value.internalBinaryRead(reader, reader.uint32(), options, message.idFilter);
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: ListArtifactsRequest, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string workflow_job_run_backend_id = 2; */
if (message.workflowJobRunBackendId !== "")
writer.tag(2, WireType.LengthDelimited).string(message.workflowJobRunBackendId);
/* google.protobuf.StringValue name_filter = 3; */
if (message.nameFilter)
StringValue.internalBinaryWrite(message.nameFilter, writer.tag(3, WireType.LengthDelimited).fork(), options).join();
/* google.protobuf.Int64Value id_filter = 4; */
if (message.idFilter)
Int64Value.internalBinaryWrite(message.idFilter, writer.tag(4, WireType.LengthDelimited).fork(), options).join();
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.ListArtifactsRequest
*/
export const ListArtifactsRequest = new ListArtifactsRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class ListArtifactsResponse$Type extends MessageType<ListArtifactsResponse> {
constructor() {
super("github.actions.results.api.v1.ListArtifactsResponse", [
{ no: 1, name: "artifacts", kind: "message", repeat: 1 /*RepeatType.PACKED*/, T: () => ListArtifactsResponse_MonolithArtifact }
]);
}
create(value?: PartialMessage<ListArtifactsResponse>): ListArtifactsResponse {
const message = { artifacts: [] };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<ListArtifactsResponse>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: ListArtifactsResponse): ListArtifactsResponse {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* repeated github.actions.results.api.v1.ListArtifactsResponse.MonolithArtifact artifacts */ 1:
message.artifacts.push(ListArtifactsResponse_MonolithArtifact.internalBinaryRead(reader, reader.uint32(), options));
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: ListArtifactsResponse, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* repeated github.actions.results.api.v1.ListArtifactsResponse.MonolithArtifact artifacts = 1; */
for (let i = 0; i < message.artifacts.length; i++)
ListArtifactsResponse_MonolithArtifact.internalBinaryWrite(message.artifacts[i], writer.tag(1, WireType.LengthDelimited).fork(), options).join();
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.ListArtifactsResponse
*/
export const ListArtifactsResponse = new ListArtifactsResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class ListArtifactsResponse_MonolithArtifact$Type extends MessageType<ListArtifactsResponse_MonolithArtifact> {
constructor() {
super("github.actions.results.api.v1.ListArtifactsResponse.MonolithArtifact", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "workflow_job_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "database_id", kind: "scalar", T: 3 /*ScalarType.INT64*/ },
{ no: 4, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 5, name: "size", kind: "scalar", T: 3 /*ScalarType.INT64*/ },
{ no: 6, name: "created_at", kind: "message", T: () => Timestamp },
{ no: 7, name: "digest", kind: "message", T: () => StringValue }
]);
}
create(value?: PartialMessage<ListArtifactsResponse_MonolithArtifact>): ListArtifactsResponse_MonolithArtifact {
const message = { workflowRunBackendId: "", workflowJobRunBackendId: "", databaseId: "0", name: "", size: "0" };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<ListArtifactsResponse_MonolithArtifact>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: ListArtifactsResponse_MonolithArtifact): ListArtifactsResponse_MonolithArtifact {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string workflow_job_run_backend_id */ 2:
message.workflowJobRunBackendId = reader.string();
break;
case /* int64 database_id */ 3:
message.databaseId = reader.int64().toString();
break;
case /* string name */ 4:
message.name = reader.string();
break;
case /* int64 size */ 5:
message.size = reader.int64().toString();
break;
case /* google.protobuf.Timestamp created_at */ 6:
message.createdAt = Timestamp.internalBinaryRead(reader, reader.uint32(), options, message.createdAt);
break;
case /* google.protobuf.StringValue digest */ 7:
message.digest = StringValue.internalBinaryRead(reader, reader.uint32(), options, message.digest);
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: ListArtifactsResponse_MonolithArtifact, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string workflow_job_run_backend_id = 2; */
if (message.workflowJobRunBackendId !== "")
writer.tag(2, WireType.LengthDelimited).string(message.workflowJobRunBackendId);
/* int64 database_id = 3; */
if (message.databaseId !== "0")
writer.tag(3, WireType.Varint).int64(message.databaseId);
/* string name = 4; */
if (message.name !== "")
writer.tag(4, WireType.LengthDelimited).string(message.name);
/* int64 size = 5; */
if (message.size !== "0")
writer.tag(5, WireType.Varint).int64(message.size);
/* google.protobuf.Timestamp created_at = 6; */
if (message.createdAt)
Timestamp.internalBinaryWrite(message.createdAt, writer.tag(6, WireType.LengthDelimited).fork(), options).join();
/* google.protobuf.StringValue digest = 7; */
if (message.digest)
StringValue.internalBinaryWrite(message.digest, writer.tag(7, WireType.LengthDelimited).fork(), options).join();
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.ListArtifactsResponse.MonolithArtifact
*/
export const ListArtifactsResponse_MonolithArtifact = new ListArtifactsResponse_MonolithArtifact$Type();
// @generated message type with reflection information, may provide speed optimized methods
class GetSignedArtifactURLRequest$Type extends MessageType<GetSignedArtifactURLRequest> {
constructor() {
super("github.actions.results.api.v1.GetSignedArtifactURLRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "workflow_job_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ }
]);
}
create(value?: PartialMessage<GetSignedArtifactURLRequest>): GetSignedArtifactURLRequest {
const message = { workflowRunBackendId: "", workflowJobRunBackendId: "", name: "" };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<GetSignedArtifactURLRequest>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: GetSignedArtifactURLRequest): GetSignedArtifactURLRequest {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string workflow_job_run_backend_id */ 2:
message.workflowJobRunBackendId = reader.string();
break;
case /* string name */ 3:
message.name = reader.string();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: GetSignedArtifactURLRequest, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string workflow_job_run_backend_id = 2; */
if (message.workflowJobRunBackendId !== "")
writer.tag(2, WireType.LengthDelimited).string(message.workflowJobRunBackendId);
/* string name = 3; */
if (message.name !== "")
writer.tag(3, WireType.LengthDelimited).string(message.name);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.GetSignedArtifactURLRequest
*/
export const GetSignedArtifactURLRequest = new GetSignedArtifactURLRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class GetSignedArtifactURLResponse$Type extends MessageType<GetSignedArtifactURLResponse> {
constructor() {
super("github.actions.results.api.v1.GetSignedArtifactURLResponse", [
{ no: 1, name: "signed_url", kind: "scalar", T: 9 /*ScalarType.STRING*/ }
]);
}
create(value?: PartialMessage<GetSignedArtifactURLResponse>): GetSignedArtifactURLResponse {
const message = { signedUrl: "" };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<GetSignedArtifactURLResponse>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: GetSignedArtifactURLResponse): GetSignedArtifactURLResponse {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string signed_url */ 1:
message.signedUrl = reader.string();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: GetSignedArtifactURLResponse, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* string signed_url = 1; */
if (message.signedUrl !== "")
writer.tag(1, WireType.LengthDelimited).string(message.signedUrl);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.GetSignedArtifactURLResponse
*/
export const GetSignedArtifactURLResponse = new GetSignedArtifactURLResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class DeleteArtifactRequest$Type extends MessageType<DeleteArtifactRequest> {
constructor() {
super("github.actions.results.api.v1.DeleteArtifactRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "workflow_job_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ }
]);
}
create(value?: PartialMessage<DeleteArtifactRequest>): DeleteArtifactRequest {
const message = { workflowRunBackendId: "", workflowJobRunBackendId: "", name: "" };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<DeleteArtifactRequest>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: DeleteArtifactRequest): DeleteArtifactRequest {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string workflow_job_run_backend_id */ 2:
message.workflowJobRunBackendId = reader.string();
break;
case /* string name */ 3:
message.name = reader.string();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: DeleteArtifactRequest, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string workflow_job_run_backend_id = 2; */
if (message.workflowJobRunBackendId !== "")
writer.tag(2, WireType.LengthDelimited).string(message.workflowJobRunBackendId);
/* string name = 3; */
if (message.name !== "")
writer.tag(3, WireType.LengthDelimited).string(message.name);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.DeleteArtifactRequest
*/
export const DeleteArtifactRequest = new DeleteArtifactRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class DeleteArtifactResponse$Type extends MessageType<DeleteArtifactResponse> {
constructor() {
super("github.actions.results.api.v1.DeleteArtifactResponse", [
{ no: 1, name: "ok", kind: "scalar", T: 8 /*ScalarType.BOOL*/ },
{ no: 2, name: "artifact_id", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
create(value?: PartialMessage<DeleteArtifactResponse>): DeleteArtifactResponse {
const message = { ok: false, artifactId: "0" };
globalThis.Object.defineProperty(message, MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
reflectionMergePartial<DeleteArtifactResponse>(this, message, value);
return message;
}
internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: DeleteArtifactResponse): DeleteArtifactResponse {
let message = target ?? this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool ok */ 1:
message.ok = reader.bool();
break;
case /* int64 artifact_id */ 2:
message.artifactId = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message: DeleteArtifactResponse, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
/* bool ok = 1; */
if (message.ok !== false)
writer.tag(1, WireType.Varint).bool(message.ok);
/* int64 artifact_id = 2; */
if (message.artifactId !== "0")
writer.tag(2, WireType.Varint).int64(message.artifactId);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.DeleteArtifactResponse
*/
export const DeleteArtifactResponse = new DeleteArtifactResponse$Type();
/**
* @generated ServiceType for protobuf service github.actions.results.api.v1.ArtifactService
*/
export const ArtifactService = new ServiceType("github.actions.results.api.v1.ArtifactService", [
{ name: "CreateArtifact", options: {}, I: CreateArtifactRequest, O: CreateArtifactResponse },
{ name: "FinalizeArtifact", options: {}, I: FinalizeArtifactRequest, O: FinalizeArtifactResponse }
]);
{ name: "FinalizeArtifact", options: {}, I: FinalizeArtifactRequest, O: FinalizeArtifactResponse },
{ name: "ListArtifacts", options: {}, I: ListArtifactsRequest, O: ListArtifactsResponse },
{ name: "GetSignedArtifactURL", options: {}, I: GetSignedArtifactURLRequest, O: GetSignedArtifactURLResponse },
{ name: "DeleteArtifact", options: {}, I: DeleteArtifactRequest, O: DeleteArtifactResponse },
{ name: "MigrateArtifact", options: {}, I: MigrateArtifactRequest, O: MigrateArtifactResponse },
{ name: "FinalizeMigratedArtifact", options: {}, I: FinalizeMigratedArtifactRequest, O: FinalizeMigratedArtifactResponse }
]);


@@ -0,0 +1,232 @@
import {
CreateArtifactRequest,
CreateArtifactResponse,
FinalizeArtifactRequest,
FinalizeArtifactResponse,
ListArtifactsRequest,
ListArtifactsResponse,
GetSignedArtifactURLRequest,
GetSignedArtifactURLResponse,
DeleteArtifactRequest,
DeleteArtifactResponse,
} from "./artifact";
//==================================//
// Client Code //
//==================================//
interface Rpc {
request(
service: string,
method: string,
contentType: "application/json" | "application/protobuf",
data: object | Uint8Array
): Promise<object | Uint8Array>;
}
export interface ArtifactServiceClient {
CreateArtifact(
request: CreateArtifactRequest
): Promise<CreateArtifactResponse>;
FinalizeArtifact(
request: FinalizeArtifactRequest
): Promise<FinalizeArtifactResponse>;
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse>;
GetSignedArtifactURL(
request: GetSignedArtifactURLRequest
): Promise<GetSignedArtifactURLResponse>;
DeleteArtifact(
request: DeleteArtifactRequest
): Promise<DeleteArtifactResponse>;
}
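// Illustrative sketch (not generated code): any object satisfying `Rpc` can back the
// clients below. By Twirp convention, requests are POSTed to
// `${baseUrl}/twirp/${service}/${method}`; the JSON-only fetch transport here is an
// assumption for illustration, not the transport @actions/artifact actually wires in.
//
//   class JsonRpc implements Rpc {
//     constructor(private readonly baseUrl: string) {}
//     async request(
//       service: string,
//       method: string,
//       contentType: "application/json" | "application/protobuf",
//       data: object | Uint8Array
//     ): Promise<object | Uint8Array> {
//       if (contentType !== "application/json") throw new Error("JSON transport only");
//       const res = await fetch(`${this.baseUrl}/twirp/${service}/${method}`, {
//         method: "POST",
//         headers: { "Content-Type": contentType },
//         body: JSON.stringify(data),
//       });
//       return (await res.json()) as object;
//     }
//   }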
export class ArtifactServiceClientJSON implements ArtifactServiceClient {
private readonly rpc: Rpc;
constructor(rpc: Rpc) {
this.rpc = rpc;
this.CreateArtifact.bind(this);
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(
request: CreateArtifactRequest
): Promise<CreateArtifactResponse> {
const data = CreateArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"CreateArtifact",
"application/json",
data as object
);
return promise.then((data) =>
CreateArtifactResponse.fromJson(data as any, {
ignoreUnknownFields: true,
})
);
}
FinalizeArtifact(
request: FinalizeArtifactRequest
): Promise<FinalizeArtifactResponse> {
const data = FinalizeArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"FinalizeArtifact",
"application/json",
data as object
);
return promise.then((data) =>
FinalizeArtifactResponse.fromJson(data as any, {
ignoreUnknownFields: true,
})
);
}
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse> {
const data = ListArtifactsRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"ListArtifacts",
"application/json",
data as object
);
return promise.then((data) =>
ListArtifactsResponse.fromJson(data as any, { ignoreUnknownFields: true })
);
}
GetSignedArtifactURL(
request: GetSignedArtifactURLRequest
): Promise<GetSignedArtifactURLResponse> {
const data = GetSignedArtifactURLRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"GetSignedArtifactURL",
"application/json",
data as object
);
return promise.then((data) =>
GetSignedArtifactURLResponse.fromJson(data as any, {
ignoreUnknownFields: true,
})
);
}
DeleteArtifact(
request: DeleteArtifactRequest
): Promise<DeleteArtifactResponse> {
const data = DeleteArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false,
});
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"DeleteArtifact",
"application/json",
data as object
);
return promise.then((data) =>
DeleteArtifactResponse.fromJson(data as any, {
ignoreUnknownFields: true,
})
);
}
}
export class ArtifactServiceClientProtobuf implements ArtifactServiceClient {
private readonly rpc: Rpc;
constructor(rpc: Rpc) {
this.rpc = rpc;
this.CreateArtifact.bind(this);
this.FinalizeArtifact.bind(this);
this.ListArtifacts.bind(this);
this.GetSignedArtifactURL.bind(this);
this.DeleteArtifact.bind(this);
}
CreateArtifact(
request: CreateArtifactRequest
): Promise<CreateArtifactResponse> {
const data = CreateArtifactRequest.toBinary(request);
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"CreateArtifact",
"application/protobuf",
data
);
return promise.then((data) =>
CreateArtifactResponse.fromBinary(data as Uint8Array)
);
}
FinalizeArtifact(
request: FinalizeArtifactRequest
): Promise<FinalizeArtifactResponse> {
const data = FinalizeArtifactRequest.toBinary(request);
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"FinalizeArtifact",
"application/protobuf",
data
);
return promise.then((data) =>
FinalizeArtifactResponse.fromBinary(data as Uint8Array)
);
}
ListArtifacts(request: ListArtifactsRequest): Promise<ListArtifactsResponse> {
const data = ListArtifactsRequest.toBinary(request);
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"ListArtifacts",
"application/protobuf",
data
);
return promise.then((data) =>
ListArtifactsResponse.fromBinary(data as Uint8Array)
);
}
GetSignedArtifactURL(
request: GetSignedArtifactURLRequest
): Promise<GetSignedArtifactURLResponse> {
const data = GetSignedArtifactURLRequest.toBinary(request);
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"GetSignedArtifactURL",
"application/protobuf",
data
);
return promise.then((data) =>
GetSignedArtifactURLResponse.fromBinary(data as Uint8Array)
);
}
DeleteArtifact(
request: DeleteArtifactRequest
): Promise<DeleteArtifactResponse> {
const data = DeleteArtifactRequest.toBinary(request);
const promise = this.rpc.request(
"github.actions.results.api.v1.ArtifactService",
"DeleteArtifact",
"application/protobuf",
data
);
return promise.then((data) =>
DeleteArtifactResponse.fromBinary(data as Uint8Array)
);
}
}
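// --- Editor's example (a minimal sketch, not part of the generated file) ---
// The clients above depend only on the `Rpc` shape, so any transport that can
// POST JSON or protobuf bytes can back them. The placeholder host and the use
// of global `fetch` below are illustrative assumptions, not the toolkit's own
// transport.
type ExampleRpc = {
  request(
    service: string,
    method: string,
    contentType: "application/json" | "application/protobuf",
    data: object | Uint8Array
  ): Promise<object | Uint8Array>;
};
const exampleRpc: ExampleRpc = {
  async request(service, method, contentType, data) {
    const url = `https://results.example.invalid/twirp/${service}/${method}`; // placeholder host
    const res = await fetch(url, {
      method: "POST",
      headers: { "Content-Type": contentType },
      body:
        contentType === "application/json"
          ? JSON.stringify(data)
          : (data as Uint8Array),
    });
    return contentType === "application/json"
      ? ((await res.json()) as object)
      : new Uint8Array(await res.arrayBuffer());
  },
};
// Usage: const client = new ArtifactServiceClientJSON(exampleRpc);
// await client.ListArtifacts({workflowRunBackendId: "...", workflowJobRunBackendId: "..."});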


@ -1,437 +0,0 @@
import {
TwirpContext,
TwirpServer,
RouterEvents,
TwirpError,
TwirpErrorCode,
Interceptor,
TwirpContentType,
chainInterceptors
} from 'twirp-ts'
import {
CreateArtifactRequest,
CreateArtifactResponse,
FinalizeArtifactRequest,
FinalizeArtifactResponse
} from './artifact'
//==================================//
// Client Code //
//==================================//
interface Rpc {
request(
service: string,
method: string,
contentType: 'application/json' | 'application/protobuf',
data: object | Uint8Array
): Promise<object | Uint8Array>
}
export interface ArtifactServiceClient {
CreateArtifact(
request: CreateArtifactRequest
): Promise<CreateArtifactResponse>
FinalizeArtifact(
request: FinalizeArtifactRequest
): Promise<FinalizeArtifactResponse>
}
export class ArtifactServiceClientJSON implements ArtifactServiceClient {
private readonly rpc: Rpc
constructor(rpc: Rpc) {
this.rpc = rpc
this.CreateArtifact.bind(this)
this.FinalizeArtifact.bind(this)
}
CreateArtifact(
request: CreateArtifactRequest
): Promise<CreateArtifactResponse> {
const data = CreateArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false
})
const promise = this.rpc.request(
'github.actions.results.api.v1.ArtifactService',
'CreateArtifact',
'application/json',
data as object
)
return promise.then(data =>
CreateArtifactResponse.fromJson(data as any, {ignoreUnknownFields: true})
)
}
FinalizeArtifact(
request: FinalizeArtifactRequest
): Promise<FinalizeArtifactResponse> {
const data = FinalizeArtifactRequest.toJson(request, {
useProtoFieldName: true,
emitDefaultValues: false
})
const promise = this.rpc.request(
'github.actions.results.api.v1.ArtifactService',
'FinalizeArtifact',
'application/json',
data as object
)
return promise.then(data =>
FinalizeArtifactResponse.fromJson(data as any, {
ignoreUnknownFields: true
})
)
}
}
export class ArtifactServiceClientProtobuf implements ArtifactServiceClient {
private readonly rpc: Rpc
constructor(rpc: Rpc) {
this.rpc = rpc
this.CreateArtifact.bind(this)
this.FinalizeArtifact.bind(this)
}
CreateArtifact(
request: CreateArtifactRequest
): Promise<CreateArtifactResponse> {
const data = CreateArtifactRequest.toBinary(request)
const promise = this.rpc.request(
'github.actions.results.api.v1.ArtifactService',
'CreateArtifact',
'application/protobuf',
data
)
return promise.then(data =>
CreateArtifactResponse.fromBinary(data as Uint8Array)
)
}
FinalizeArtifact(
request: FinalizeArtifactRequest
): Promise<FinalizeArtifactResponse> {
const data = FinalizeArtifactRequest.toBinary(request)
const promise = this.rpc.request(
'github.actions.results.api.v1.ArtifactService',
'FinalizeArtifact',
'application/protobuf',
data
)
return promise.then(data =>
FinalizeArtifactResponse.fromBinary(data as Uint8Array)
)
}
}
//==================================//
// Server Code //
//==================================//
export interface ArtifactServiceTwirp<T extends TwirpContext = TwirpContext> {
CreateArtifact(
ctx: T,
request: CreateArtifactRequest
): Promise<CreateArtifactResponse>
FinalizeArtifact(
ctx: T,
request: FinalizeArtifactRequest
): Promise<FinalizeArtifactResponse>
}
export enum ArtifactServiceMethod {
CreateArtifact = 'CreateArtifact',
FinalizeArtifact = 'FinalizeArtifact'
}
export const ArtifactServiceMethodList = [
ArtifactServiceMethod.CreateArtifact,
ArtifactServiceMethod.FinalizeArtifact
]
export function createArtifactServiceServer<
T extends TwirpContext = TwirpContext
>(service: ArtifactServiceTwirp<T>) {
return new TwirpServer<ArtifactServiceTwirp, T>({
service,
packageName: 'github.actions.results.api.v1',
serviceName: 'ArtifactService',
methodList: ArtifactServiceMethodList,
matchRoute: matchArtifactServiceRoute
})
}
function matchArtifactServiceRoute<T extends TwirpContext = TwirpContext>(
method: string,
events: RouterEvents<T>
) {
switch (method) {
case 'CreateArtifact':
return async (
ctx: T,
service: ArtifactServiceTwirp,
data: Buffer,
interceptors?: Interceptor<
T,
CreateArtifactRequest,
CreateArtifactResponse
>[]
) => {
ctx = {...ctx, methodName: 'CreateArtifact'}
await events.onMatch(ctx)
return handleArtifactServiceCreateArtifactRequest(
ctx,
service,
data,
interceptors
)
}
case 'FinalizeArtifact':
return async (
ctx: T,
service: ArtifactServiceTwirp,
data: Buffer,
interceptors?: Interceptor<
T,
FinalizeArtifactRequest,
FinalizeArtifactResponse
>[]
) => {
ctx = {...ctx, methodName: 'FinalizeArtifact'}
await events.onMatch(ctx)
return handleArtifactServiceFinalizeArtifactRequest(
ctx,
service,
data,
interceptors
)
}
default:
events.onNotFound()
const msg = `no handler found`
throw new TwirpError(TwirpErrorCode.BadRoute, msg)
}
}
function handleArtifactServiceCreateArtifactRequest<
T extends TwirpContext = TwirpContext
>(
ctx: T,
service: ArtifactServiceTwirp,
data: Buffer,
interceptors?: Interceptor<T, CreateArtifactRequest, CreateArtifactResponse>[]
): Promise<string | Uint8Array> {
switch (ctx.contentType) {
case TwirpContentType.JSON:
return handleArtifactServiceCreateArtifactJSON<T>(
ctx,
service,
data,
interceptors
)
case TwirpContentType.Protobuf:
return handleArtifactServiceCreateArtifactProtobuf<T>(
ctx,
service,
data,
interceptors
)
default:
const msg = 'unexpected Content-Type'
throw new TwirpError(TwirpErrorCode.BadRoute, msg)
}
}
function handleArtifactServiceFinalizeArtifactRequest<
T extends TwirpContext = TwirpContext
>(
ctx: T,
service: ArtifactServiceTwirp,
data: Buffer,
interceptors?: Interceptor<
T,
FinalizeArtifactRequest,
FinalizeArtifactResponse
>[]
): Promise<string | Uint8Array> {
switch (ctx.contentType) {
case TwirpContentType.JSON:
return handleArtifactServiceFinalizeArtifactJSON<T>(
ctx,
service,
data,
interceptors
)
case TwirpContentType.Protobuf:
return handleArtifactServiceFinalizeArtifactProtobuf<T>(
ctx,
service,
data,
interceptors
)
default:
const msg = 'unexpected Content-Type'
throw new TwirpError(TwirpErrorCode.BadRoute, msg)
}
}
async function handleArtifactServiceCreateArtifactJSON<
T extends TwirpContext = TwirpContext
>(
ctx: T,
service: ArtifactServiceTwirp,
data: Buffer,
interceptors?: Interceptor<T, CreateArtifactRequest, CreateArtifactResponse>[]
) {
let request: CreateArtifactRequest
let response: CreateArtifactResponse
try {
const body = JSON.parse(data.toString() || '{}')
request = CreateArtifactRequest.fromJson(body, {ignoreUnknownFields: true})
} catch (e) {
if (e instanceof Error) {
const msg = 'the json request could not be decoded'
throw new TwirpError(TwirpErrorCode.Malformed, msg).withCause(e, true)
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = chainInterceptors(...interceptors) as Interceptor<
T,
CreateArtifactRequest,
CreateArtifactResponse
>
response = await interceptor(ctx, request!, (ctx, inputReq) => {
return service.CreateArtifact(ctx, inputReq)
})
} else {
response = await service.CreateArtifact(ctx, request!)
}
return JSON.stringify(
CreateArtifactResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false
}) as string
)
}
async function handleArtifactServiceFinalizeArtifactJSON<
T extends TwirpContext = TwirpContext
>(
ctx: T,
service: ArtifactServiceTwirp,
data: Buffer,
interceptors?: Interceptor<
T,
FinalizeArtifactRequest,
FinalizeArtifactResponse
>[]
) {
let request: FinalizeArtifactRequest
let response: FinalizeArtifactResponse
try {
const body = JSON.parse(data.toString() || '{}')
request = FinalizeArtifactRequest.fromJson(body, {
ignoreUnknownFields: true
})
} catch (e) {
if (e instanceof Error) {
const msg = 'the json request could not be decoded'
throw new TwirpError(TwirpErrorCode.Malformed, msg).withCause(e, true)
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = chainInterceptors(...interceptors) as Interceptor<
T,
FinalizeArtifactRequest,
FinalizeArtifactResponse
>
response = await interceptor(ctx, request!, (ctx, inputReq) => {
return service.FinalizeArtifact(ctx, inputReq)
})
} else {
response = await service.FinalizeArtifact(ctx, request!)
}
return JSON.stringify(
FinalizeArtifactResponse.toJson(response, {
useProtoFieldName: true,
emitDefaultValues: false
}) as string
)
}
async function handleArtifactServiceCreateArtifactProtobuf<
T extends TwirpContext = TwirpContext
>(
ctx: T,
service: ArtifactServiceTwirp,
data: Buffer,
interceptors?: Interceptor<T, CreateArtifactRequest, CreateArtifactResponse>[]
) {
let request: CreateArtifactRequest
let response: CreateArtifactResponse
try {
request = CreateArtifactRequest.fromBinary(data)
} catch (e) {
if (e instanceof Error) {
const msg = 'the protobuf request could not be decoded'
throw new TwirpError(TwirpErrorCode.Malformed, msg).withCause(e, true)
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = chainInterceptors(...interceptors) as Interceptor<
T,
CreateArtifactRequest,
CreateArtifactResponse
>
response = await interceptor(ctx, request!, (ctx, inputReq) => {
return service.CreateArtifact(ctx, inputReq)
})
} else {
response = await service.CreateArtifact(ctx, request!)
}
return Buffer.from(CreateArtifactResponse.toBinary(response))
}
async function handleArtifactServiceFinalizeArtifactProtobuf<
T extends TwirpContext = TwirpContext
>(
ctx: T,
service: ArtifactServiceTwirp,
data: Buffer,
interceptors?: Interceptor<
T,
FinalizeArtifactRequest,
FinalizeArtifactResponse
>[]
) {
let request: FinalizeArtifactRequest
let response: FinalizeArtifactResponse
try {
request = FinalizeArtifactRequest.fromBinary(data)
} catch (e) {
if (e instanceof Error) {
const msg = 'the protobuf request could not be decoded'
throw new TwirpError(TwirpErrorCode.Malformed, msg).withCause(e, true)
}
}
if (interceptors && interceptors.length > 0) {
const interceptor = chainInterceptors(...interceptors) as Interceptor<
T,
FinalizeArtifactRequest,
FinalizeArtifactResponse
>
response = await interceptor(ctx, request!, (ctx, inputReq) => {
return service.FinalizeArtifact(ctx, inputReq)
})
} else {
response = await service.FinalizeArtifact(ctx, request!)
}
return Buffer.from(FinalizeArtifactResponse.toBinary(response))
}


@ -1,122 +1,126 @@
import {warning} from '@actions/core'
import {isGhes} from './shared/config'
import {
UploadOptions,
UploadResponse,
UploadArtifactOptions,
UploadArtifactResponse,
DownloadArtifactOptions,
GetArtifactResponse,
ListArtifactsOptions,
ListArtifactsResponse,
DownloadArtifactResponse
DownloadArtifactResponse,
FindOptions,
DeleteArtifactResponse
} from './shared/interfaces'
import {uploadArtifact} from './upload/upload-artifact'
import {downloadArtifact} from './download/download-artifact'
import {getArtifact} from './find/get-artifact'
import {listArtifacts} from './find/list-artifacts'
import {
downloadArtifactPublic,
downloadArtifactInternal
} from './download/download-artifact'
import {
deleteArtifactPublic,
deleteArtifactInternal
} from './delete/delete-artifact'
import {getArtifactPublic, getArtifactInternal} from './find/get-artifact'
import {listArtifactsPublic, listArtifactsInternal} from './find/list-artifacts'
import {GHESNotSupportedError} from './shared/errors'
/**
* Generic interface for the artifact client.
*/
export interface ArtifactClient {
/**
* Uploads an artifact
* Uploads an artifact.
*
* @param name The name of the artifact, required
* @param files A list of absolute or relative paths that denote what files should be uploaded
* @param rootDirectory An absolute or relative file path that denotes the root parent directory of the files being uploaded
* @param options Extra options for customizing the upload behavior
* @returns single UploadResponse object
* @returns single UploadArtifactResponse object
*/
uploadArtifact(
name: string,
files: string[],
rootDirectory: string,
options?: UploadOptions
): Promise<UploadResponse>
options?: UploadArtifactOptions
): Promise<UploadArtifactResponse>
/**
* Lists all artifacts that are part of a workflow run.
* Lists all artifacts that are part of the current workflow run.
* This function will return at most 1000 artifacts per workflow run.
*
* This calls the public List-Artifacts API https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
* Due to paginated responses from the public API, this function will return at most 1000 artifacts per workflow run (100 per page * maximum 10 calls)
* If `options.findBy` is specified, this will call the public List-Artifacts API which can list from other runs.
* https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
*
* @param workflowRunId The workflow run id that the artifact belongs to
* @param repositoryOwner The owner of the repository that the artifact belongs to
* @param repositoryName The name of the repository that the artifact belongs to
* @param token A token with the appropriate permission to the repository to list artifacts
* @param options Extra options that allow for the customization of the list behavior
* @returns ListArtifactResponse object
*/
listArtifacts(
workflowRunId: number,
repositoryOwner: string,
repositoryName: string,
token: string
options?: ListArtifactsOptions & FindOptions
): Promise<ListArtifactsResponse>
/**
* Finds an artifact by name given a repository and workflow run id.
* Finds an artifact by name.
* If there are multiple artifacts with the same name in the same workflow run, this will return the latest.
* If the artifact is not found, it will throw.
*
* This calls the public List-Artifacts API with a name filter https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
* @actions/artifact > 2.0.0 does not allow for creating multiple artifacts with the same name in the same workflow run.
* It is possible to have multiple artifacts with the same name in the same workflow run by using old versions of upload-artifact (v1,v2 and v3) or @actions/artifact < v2.0.0
* If `options.findBy` is specified, this will use the public List Artifacts API with a name filter which can get artifacts from other runs.
* https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#list-workflow-run-artifacts
* `@actions/artifact` v2+ does not allow for creating multiple artifacts with the same name in the same workflow run.
* It is possible to have multiple artifacts with the same name in the same workflow run when using old versions of upload-artifact (v1, v2 and v3), @actions/artifact < v2, or when the workflow run is a rerun.
* If there are multiple artifacts with the same name in the same workflow run, this function will return the first artifact that matches the name.
*
* @param artifactName The name of the artifact to find
* @param workflowRunId The workflow run id that the artifact belongs to
* @param repositoryOwner The owner of the repository that the artifact belongs to
* @param repositoryName The name of the repository that the artifact belongs to
* @param token A token with the appropriate permission to the repository to find the artifact
* @param options Extra options that allow for the customization of the get behavior
*/
getArtifact(
artifactName: string,
workflowRunId: number,
repositoryOwner: string,
repositoryName: string,
token: string
options?: FindOptions
): Promise<GetArtifactResponse>
/**
* Downloads an artifact and unzips the content
* Downloads an artifact and unzips the content.
*
* @param artifactId The name of the artifact to download
* @param repositoryOwner The owner of the repository that the artifact belongs to
* @param repositoryName The name of the repository that the artifact belongs to
* @param token A token with the appropriate permission to the repository to download the artifact
* If `options.findBy` is specified, this will use the public Download Artifact API https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#download-an-artifact
*
* @param artifactId The id of the artifact to download
* @param options Extra options that allow for the customization of the download behavior
* @returns single DownloadArtifactResponse object
*/
downloadArtifact(
artifactId: number,
repositoryOwner: string,
repositoryName: string,
token: string,
options?: DownloadArtifactOptions
options?: DownloadArtifactOptions & FindOptions
): Promise<DownloadArtifactResponse>
/**
* Delete an Artifact
*
* If `options.findBy` is specified, this will use the public Delete Artifact API https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#delete-an-artifact
*
* @param artifactName The name of the artifact to delete
* @param options Extra options that allow for the customization of the delete behavior
* @returns single DeleteArtifactResponse object
*/
deleteArtifact(
artifactName: string,
options?: FindOptions
): Promise<DeleteArtifactResponse>
}
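// --- Editor's example (a minimal sketch) ---
// Typical use of the interface above from inside the current workflow run,
// using the `DefaultArtifactClient` implementation declared further down.
async function exampleCurrentRunUsage(): Promise<void> {
  const client = new DefaultArtifactClient()
  // `latest: true` de-duplicates rerun artifacts by name
  const {artifacts} = await client.listArtifacts({latest: true})
  if (artifacts.length > 0) {
    const {artifact} = await client.getArtifact(artifacts[0].name)
    const {downloadPath} = await client.downloadArtifact(artifact.id)
    console.log(`Downloaded '${artifact.name}' to ${downloadPath}`)
  }
}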
export class Client implements ArtifactClient {
/**
* Constructs a Client
*/
static create(): Client {
return new Client()
}
/**
* Upload Artifact
*/
/**
* The default artifact client that is used by the artifact action(s).
*/
export class DefaultArtifactClient implements ArtifactClient {
async uploadArtifact(
name: string,
files: string[],
rootDirectory: string,
options?: UploadOptions | undefined
): Promise<UploadResponse> {
if (isGhes()) {
warning(
`@actions/artifact v2.0.0+ and upload-artifact@v4+ are not currently supported on GHES.`
)
return {
success: false
}
}
options?: UploadArtifactOptions
): Promise<UploadArtifactResponse> {
try {
if (isGhes()) {
throw new GHESNotSupportedError()
}
return uploadArtifact(name, files, rootDirectory, options)
} catch (error) {
warning(
@ -126,79 +130,72 @@ Errors can be temporary, so please try again and optionally run the action with
If the error persists, please check whether Actions is operating normally at [https://githubstatus.com](https://www.githubstatus.com).`
)
return {
success: false
}
throw error
}
}
/**
* Download Artifact
*/
async downloadArtifact(
artifactId: number,
repositoryOwner: string,
repositoryName: string,
token: string,
options?: DownloadArtifactOptions
options?: DownloadArtifactOptions & FindOptions
): Promise<DownloadArtifactResponse> {
if (isGhes()) {
warning(
`@actions/artifact v2.0.0+ and download-artifact@v4+ are not currently supported on GHES.`
)
return {
success: false
}
}
try {
return downloadArtifact(
artifactId,
repositoryOwner,
repositoryName,
token,
options
)
if (isGhes()) {
throw new GHESNotSupportedError()
}
if (options?.findBy) {
const {
findBy: {repositoryOwner, repositoryName, token},
...downloadOptions
} = options
return downloadArtifactPublic(
artifactId,
repositoryOwner,
repositoryName,
token,
downloadOptions
)
}
return downloadArtifactInternal(artifactId, options)
} catch (error) {
warning(
`Artifact download failed with error: ${error}.
`Download Artifact failed with error: ${error}.
Errors can be temporary, so please try again and optionally run the action with debug mode enabled for more information.
If the error persists, please check whether Actions and API requests are operating normally at [https://githubstatus.com](https://www.githubstatus.com).`
)
return {
success: false
}
throw error
}
}
/**
* List Artifacts
*/
async listArtifacts(
workflowRunId: number,
repositoryOwner: string,
repositoryName: string,
token: string
options?: ListArtifactsOptions & FindOptions
): Promise<ListArtifactsResponse> {
if (isGhes()) {
warning(
`@actions/artifact v2.0.0+ and download-artifact@v4+ are not currently supported on GHES.`
)
return {
artifacts: []
}
}
try {
return listArtifacts(
workflowRunId,
repositoryOwner,
repositoryName,
token
)
if (isGhes()) {
throw new GHESNotSupportedError()
}
if (options?.findBy) {
const {
findBy: {workflowRunId, repositoryOwner, repositoryName, token}
} = options
return listArtifactsPublic(
workflowRunId,
repositoryOwner,
repositoryName,
token,
options?.latest
)
}
return listArtifactsInternal(options?.latest)
} catch (error: unknown) {
warning(
`Listing Artifacts failed with error: ${error}.
@ -208,50 +205,80 @@ Errors can be temporary, so please try again and optionally run the action with
If the error persists, please check whether Actions and API requests are operating normally at [https://githubstatus.com](https://www.githubstatus.com).`
)
return {
artifacts: []
}
throw error
}
}
/**
* Get Artifact
*/
async getArtifact(
artifactName: string,
workflowRunId: number,
repositoryOwner: string,
repositoryName: string,
token: string
options?: FindOptions
): Promise<GetArtifactResponse> {
if (isGhes()) {
warning(
`@actions/artifact v2.0.0+ and download-artifact@v4+ are not currently supported on GHES.`
)
return {
success: false
}
}
try {
return getArtifact(
artifactName,
workflowRunId,
repositoryOwner,
repositoryName,
token
)
if (isGhes()) {
throw new GHESNotSupportedError()
}
if (options?.findBy) {
const {
findBy: {workflowRunId, repositoryOwner, repositoryName, token}
} = options
return getArtifactPublic(
artifactName,
workflowRunId,
repositoryOwner,
repositoryName,
token
)
}
return getArtifactInternal(artifactName)
} catch (error: unknown) {
warning(
`Fetching Artifact failed with error: ${error}.
`Get Artifact failed with error: ${error}.
Errors can be temporary, so please try again and optionally run the action with debug mode enabled for more information.
If the error persists, please check whether Actions and API requests are operating normally at [https://githubstatus.com](https://www.githubstatus.com).`
)
return {
success: false
throw error
}
}
async deleteArtifact(
artifactName: string,
options?: FindOptions
): Promise<DeleteArtifactResponse> {
try {
if (isGhes()) {
throw new GHESNotSupportedError()
}
if (options?.findBy) {
const {
findBy: {repositoryOwner, repositoryName, workflowRunId, token}
} = options
return deleteArtifactPublic(
artifactName,
workflowRunId,
repositoryOwner,
repositoryName,
token
)
}
return deleteArtifactInternal(artifactName)
} catch (error) {
warning(
`Delete Artifact failed with error: ${error}.
Errors can be temporary, so please try again and optionally run the action with debug mode enabled for more information.
If the error persists, please check whether Actions and API requests are operating normally at [https://githubstatus.com](https://www.githubstatus.com).`
)
throw error
}
}
}
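// --- Editor's example (a minimal sketch) ---
// Cross-run access: when `options.findBy` is present, every method above
// routes to the public REST API instead of the internal Twirp client. The
// run id and artifact name below are placeholders; the token needs
// actions:read on the target repository.
async function exampleCrossRunDownload(token: string): Promise<void> {
  const client = new DefaultArtifactClient()
  const findBy = {
    token,
    workflowRunId: 123456789, // placeholder run id
    repositoryOwner: 'actions',
    repositoryName: 'toolkit'
  }
  const {artifact} = await client.getArtifact('my-artifact', {findBy})
  await client.downloadArtifact(artifact.id, {findBy, path: './restored'})
}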


@ -0,0 +1,109 @@
import {info, debug} from '@actions/core'
import {getOctokit} from '@actions/github'
import {DeleteArtifactResponse} from '../shared/interfaces'
import {getUserAgentString} from '../shared/user-agent'
import {getRetryOptions} from '../find/retry-options'
import {defaults as defaultGitHubOptions} from '@actions/github/lib/utils'
import {requestLog} from '@octokit/plugin-request-log'
import {retry} from '@octokit/plugin-retry'
import {OctokitOptions} from '@octokit/core/dist-types/types'
import {internalArtifactTwirpClient} from '../shared/artifact-twirp-client'
import {getBackendIdsFromToken} from '../shared/util'
import {
DeleteArtifactRequest,
ListArtifactsRequest,
StringValue
} from '../../generated'
import {getArtifactPublic} from '../find/get-artifact'
import {ArtifactNotFoundError, InvalidResponseError} from '../shared/errors'
export async function deleteArtifactPublic(
artifactName: string,
workflowRunId: number,
repositoryOwner: string,
repositoryName: string,
token: string
): Promise<DeleteArtifactResponse> {
const [retryOpts, requestOpts] = getRetryOptions(defaultGitHubOptions)
const opts: OctokitOptions = {
log: undefined,
userAgent: getUserAgentString(),
previews: undefined,
retry: retryOpts,
request: requestOpts
}
const github = getOctokit(token, opts, retry, requestLog)
const getArtifactResp = await getArtifactPublic(
artifactName,
workflowRunId,
repositoryOwner,
repositoryName,
token
)
const deleteArtifactResp = await github.rest.actions.deleteArtifact({
owner: repositoryOwner,
repo: repositoryName,
artifact_id: getArtifactResp.artifact.id
})
if (deleteArtifactResp.status !== 204) {
throw new InvalidResponseError(
`Invalid response from GitHub API: ${deleteArtifactResp.status} (${deleteArtifactResp?.headers?.['x-github-request-id']})`
)
}
return {
id: getArtifactResp.artifact.id
}
}
export async function deleteArtifactInternal(
artifactName: string
): Promise<DeleteArtifactResponse> {
const artifactClient = internalArtifactTwirpClient()
const {workflowRunBackendId, workflowJobRunBackendId} =
getBackendIdsFromToken()
const listReq: ListArtifactsRequest = {
workflowRunBackendId,
workflowJobRunBackendId,
nameFilter: StringValue.create({value: artifactName})
}
const listRes = await artifactClient.ListArtifacts(listReq)
if (listRes.artifacts.length === 0) {
throw new ArtifactNotFoundError(
`Artifact not found for name: ${artifactName}`
)
}
let artifact = listRes.artifacts[0]
if (listRes.artifacts.length > 1) {
artifact = listRes.artifacts.sort(
(a, b) => Number(b.databaseId) - Number(a.databaseId)
)[0]
debug(
`More than one artifact found for a single name, returning newest (id: ${artifact.databaseId})`
)
}
const req: DeleteArtifactRequest = {
workflowRunBackendId: artifact.workflowRunBackendId,
workflowJobRunBackendId: artifact.workflowJobRunBackendId,
name: artifact.name
}
const res = await artifactClient.DeleteArtifact(req)
info(`Artifact '${artifactName}' (ID: ${res.artifactId}) deleted`)
return {
id: Number(res.artifactId)
}
}
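// --- Editor's example (a minimal sketch) ---
// Name resolution above picks the newest artifact (highest databaseId) when a
// rerun left duplicates, so deleting by name stays deterministic.
async function exampleDeleteByName(): Promise<void> {
  const {id} = await deleteArtifactInternal('flaky-test-logs') // hypothetical name
  console.log(`Deleted artifact ${id}`)
}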


@ -1,14 +1,26 @@
import fs from 'fs/promises'
import * as crypto from 'crypto'
import * as stream from 'stream'
import * as github from '@actions/github'
import * as core from '@actions/core'
import * as httpClient from '@actions/http-client'
import unzipper from 'unzipper'
import unzip from 'unzip-stream'
import {
DownloadArtifactOptions,
DownloadArtifactResponse
DownloadArtifactResponse,
StreamExtractResponse
} from '../shared/interfaces'
import {getUserAgentString} from '../shared/user-agent'
import {getGitHubWorkspaceDir} from '../shared/config'
import {internalArtifactTwirpClient} from '../shared/artifact-twirp-client'
import {
GetSignedArtifactURLRequest,
Int64Value,
ListArtifactsRequest
} from '../../generated'
import {getBackendIdsFromToken} from '../shared/util'
import {ArtifactNotFoundError} from '../shared/errors'
const scrubQueryParameters = (url: string): string => {
const parsed = new URL(url)
@ -29,39 +41,99 @@ async function exists(path: string): Promise<boolean> {
}
}
async function streamExtract(url: string, directory: string): Promise<void> {
async function streamExtract(
url: string,
directory: string
): Promise<StreamExtractResponse> {
let retryCount = 0
while (retryCount < 5) {
try {
return await streamExtractExternal(url, directory)
} catch (error) {
retryCount++
core.debug(
`Failed to download artifact after ${retryCount} retries due to ${error.message}. Retrying in 5 seconds...`
)
// wait 5 seconds before retrying
await new Promise(resolve => setTimeout(resolve, 5000))
}
}
throw new Error(`Artifact download failed after ${retryCount} retries.`)
}
export async function streamExtractExternal(
url: string,
directory: string,
opts: {timeout: number} = {timeout: 30 * 1000}
): Promise<StreamExtractResponse> {
const client = new httpClient.HttpClient(getUserAgentString())
const response = await client.get(url)
if (response.message.statusCode !== 200) {
throw new Error(
`Unexpected HTTP response from blob storage: ${response.message.statusCode} ${response.message.statusMessage}`
)
}
return response.message.pipe(unzipper.Extract({path: directory})).promise()
let sha256Digest: string | undefined = undefined
return new Promise((resolve, reject) => {
const timerFn = (): void => {
const timeoutError = new Error(
`Blob storage chunk did not respond in ${opts.timeout}ms`
)
response.message.destroy(timeoutError)
reject(timeoutError)
}
const timer = setTimeout(timerFn, opts.timeout)
const hashStream = crypto.createHash('sha256').setEncoding('hex')
const passThrough = new stream.PassThrough()
response.message.pipe(passThrough)
passThrough.pipe(hashStream)
const extractStream = passThrough
extractStream
.on('data', () => {
timer.refresh()
})
.on('error', (error: Error) => {
core.debug(
`response.message: Artifact download failed: ${error.message}`
)
clearTimeout(timer)
reject(error)
})
.pipe(unzip.Extract({path: directory}))
.on('close', () => {
clearTimeout(timer)
if (hashStream) {
hashStream.end()
sha256Digest = hashStream.read() as string
core.info(`SHA256 digest of downloaded artifact is ${sha256Digest}`)
}
resolve({sha256Digest: `sha256:${sha256Digest}`})
})
.on('error', (error: Error) => {
reject(error)
})
})
}
export async function downloadArtifact(
export async function downloadArtifactPublic(
artifactId: number,
repositoryOwner: string,
repositoryName: string,
token: string,
options?: DownloadArtifactOptions
): Promise<DownloadArtifactResponse> {
const downloadPath = options?.path || getGitHubWorkspaceDir()
if (!(await exists(downloadPath))) {
core.debug(
`Artifact destination folder does not exist, creating: ${downloadPath}`
)
await fs.mkdir(downloadPath, {recursive: true})
} else {
core.debug(`Artifact destination folder already exists: ${downloadPath}`)
}
const downloadPath = await resolveOrCreateDirectory(options?.path)
const api = github.getOctokit(token)
let digestMismatch = false
core.info(
`Downloading artifact '${artifactId}' from '${repositoryOwner}/${repositoryName}'`
)
@ -91,11 +163,94 @@ export async function downloadArtifact(
try {
core.info(`Starting download of artifact to: ${downloadPath}`)
await streamExtract(location, downloadPath)
const extractResponse = await streamExtract(location, downloadPath)
core.info(`Artifact download completed successfully.`)
if (options?.expectedHash) {
if (options?.expectedHash !== extractResponse.sha256Digest) {
digestMismatch = true
core.debug(`Computed digest: ${extractResponse.sha256Digest}`)
core.debug(`Expected digest: ${options.expectedHash}`)
}
}
} catch (error) {
throw new Error(`Unable to download and extract artifact: ${error.message}`)
}
return {success: true, downloadPath}
return {downloadPath, digestMismatch}
}
export async function downloadArtifactInternal(
artifactId: number,
options?: DownloadArtifactOptions
): Promise<DownloadArtifactResponse> {
const downloadPath = await resolveOrCreateDirectory(options?.path)
const artifactClient = internalArtifactTwirpClient()
let digestMismatch = false
const {workflowRunBackendId, workflowJobRunBackendId} =
getBackendIdsFromToken()
const listReq: ListArtifactsRequest = {
workflowRunBackendId,
workflowJobRunBackendId,
idFilter: Int64Value.create({value: artifactId.toString()})
}
const {artifacts} = await artifactClient.ListArtifacts(listReq)
if (artifacts.length === 0) {
throw new ArtifactNotFoundError(
`No artifacts found for ID: ${artifactId}\nAre you trying to download from a different run? Try specifying a github-token with \`actions:read\` scope.`
)
}
if (artifacts.length > 1) {
core.warning('Multiple artifacts found, defaulting to first.')
}
const signedReq: GetSignedArtifactURLRequest = {
workflowRunBackendId: artifacts[0].workflowRunBackendId,
workflowJobRunBackendId: artifacts[0].workflowJobRunBackendId,
name: artifacts[0].name
}
const {signedUrl} = await artifactClient.GetSignedArtifactURL(signedReq)
core.info(
`Redirecting to blob download url: ${scrubQueryParameters(signedUrl)}`
)
try {
core.info(`Starting download of artifact to: ${downloadPath}`)
const extractResponse = await streamExtract(signedUrl, downloadPath)
core.info(`Artifact download completed successfully.`)
if (options?.expectedHash) {
if (options?.expectedHash !== extractResponse.sha256Digest) {
digestMismatch = true
core.debug(`Computed digest: ${extractResponse.sha256Digest}`)
core.debug(`Expected digest: ${options.expectedHash}`)
}
}
} catch (error) {
throw new Error(`Unable to download and extract artifact: ${error.message}`)
}
return {downloadPath, digestMismatch}
}
async function resolveOrCreateDirectory(
downloadPath = getGitHubWorkspaceDir()
): Promise<string> {
if (!(await exists(downloadPath))) {
core.debug(
`Artifact destination folder does not exist, creating: ${downloadPath}`
)
await fs.mkdir(downloadPath, {recursive: true})
} else {
core.debug(`Artifact destination folder already exists: ${downloadPath}`)
}
return downloadPath
}
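// --- Editor's example (a minimal sketch) ---
// `expectedHash` does not make the download fail on mismatch; it only sets
// `digestMismatch` on the response, leaving the policy to the caller.
async function exampleVerifiedDownload(
  artifactId: number,
  expectedHash: string // e.g. the 'sha256:...' digest recorded at upload
): Promise<string> {
  const {downloadPath, digestMismatch} = await downloadArtifactInternal(
    artifactId,
    {expectedHash}
  )
  if (digestMismatch || !downloadPath) {
    throw new Error(`Artifact ${artifactId} failed digest verification`)
  }
  return downloadPath
}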


@ -1,14 +1,18 @@
import {GetArtifactResponse} from '../shared/interfaces'
import {getOctokit} from '@actions/github'
import {getUserAgentString} from '../shared/user-agent'
import {defaults as defaultGitHubOptions} from '@actions/github/lib/utils'
import {getRetryOptions} from './retry-options'
import {requestLog} from '@octokit/plugin-request-log'
import {retry} from '@octokit/plugin-retry'
import * as core from '@actions/core'
import {OctokitOptions} from '@octokit/core/dist-types/types'
import {defaults as defaultGitHubOptions} from '@actions/github/lib/utils'
import {getRetryOptions} from './retry-options'
import {requestLog} from '@octokit/plugin-request-log'
import {GetArtifactResponse} from '../shared/interfaces'
import {getBackendIdsFromToken} from '../shared/util'
import {getUserAgentString} from '../shared/user-agent'
import {internalArtifactTwirpClient} from '../shared/artifact-twirp-client'
import {ListArtifactsRequest, StringValue, Timestamp} from '../../generated'
import {ArtifactNotFoundError, InvalidResponseError} from '../shared/errors'
export async function getArtifact(
export async function getArtifactPublic(
artifactName: string,
workflowRunId: number,
repositoryOwner: string,
@ -38,32 +42,84 @@ export async function getArtifact(
)
if (getArtifactResp.status !== 200) {
core.warning(`non-200 response from GitHub API: ${getArtifactResp.status}`)
return {
success: false
}
throw new InvalidResponseError(
`Invalid response from GitHub API: ${getArtifactResp.status} (${getArtifactResp?.headers?.['x-github-request-id']})`
)
}
if (getArtifactResp.data.artifacts.length === 0) {
core.warning('no artifacts found')
return {
success: false
}
throw new ArtifactNotFoundError(
`Artifact not found for name: ${artifactName}
Please ensure that your artifact is not expired and the artifact was uploaded using a compatible version of toolkit/upload-artifact.
For more information, visit the GitHub Artifacts FAQ: https://github.com/actions/toolkit/blob/main/packages/artifact/docs/faq.md`
)
}
let artifact = getArtifactResp.data.artifacts[0]
if (getArtifactResp.data.artifacts.length > 1) {
core.warning(
'more than one artifact found for a single name, returning first'
artifact = getArtifactResp.data.artifacts.sort((a, b) => b.id - a.id)[0]
core.debug(
`More than one artifact found for a single name, returning newest (id: ${artifact.id})`
)
}
return {
success: true,
artifact: {
name: getArtifactResp.data.artifacts[0].name,
id: getArtifactResp.data.artifacts[0].id,
url: getArtifactResp.data.artifacts[0].url,
size: getArtifactResp.data.artifacts[0].size_in_bytes
name: artifact.name,
id: artifact.id,
size: artifact.size_in_bytes,
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: artifact.digest
}
}
}
export async function getArtifactInternal(
artifactName: string
): Promise<GetArtifactResponse> {
const artifactClient = internalArtifactTwirpClient()
const {workflowRunBackendId, workflowJobRunBackendId} =
getBackendIdsFromToken()
const req: ListArtifactsRequest = {
workflowRunBackendId,
workflowJobRunBackendId,
nameFilter: StringValue.create({value: artifactName})
}
const res = await artifactClient.ListArtifacts(req)
if (res.artifacts.length === 0) {
throw new ArtifactNotFoundError(
`Artifact not found for name: ${artifactName}
Please ensure that your artifact is not expired and the artifact was uploaded using a compatible version of toolkit/upload-artifact.
For more information, visit the GitHub Artifacts FAQ: https://github.com/actions/toolkit/blob/main/packages/artifact/docs/faq.md`
)
}
let artifact = res.artifacts[0]
if (res.artifacts.length > 1) {
artifact = res.artifacts.sort(
(a, b) => Number(b.databaseId) - Number(a.databaseId)
)[0]
core.debug(
`More than one artifact found for a single name, returning newest (id: ${artifact.databaseId})`
)
}
return {
artifact: {
name: artifact.name,
id: Number(artifact.databaseId),
size: Number(artifact.size),
createdAt: artifact.createdAt
? Timestamp.toDate(artifact.createdAt)
: undefined,
digest: artifact.digest?.value
}
}
}


@ -7,23 +7,27 @@ import {defaults as defaultGitHubOptions} from '@actions/github/lib/utils'
import {requestLog} from '@octokit/plugin-request-log'
import {retry} from '@octokit/plugin-retry'
import {OctokitOptions} from '@octokit/core/dist-types/types'
import {internalArtifactTwirpClient} from '../shared/artifact-twirp-client'
import {getBackendIdsFromToken} from '../shared/util'
import {getMaxArtifactListCount} from '../shared/config'
import {ListArtifactsRequest, Timestamp} from '../../generated'
// Limiting to 1000 for perf reasons
const maximumArtifactCount = 1000
const maximumArtifactCount = getMaxArtifactListCount()
const paginationCount = 100
const maxNumberOfPages = maximumArtifactCount / paginationCount
const maxNumberOfPages = Math.ceil(maximumArtifactCount / paginationCount)
export async function listArtifacts(
export async function listArtifactsPublic(
workflowRunId: number,
repositoryOwner: string,
repositoryName: string,
token: string
token: string,
latest = false
): Promise<ListArtifactsResponse> {
info(
`Fetching artifact list for workflow run ${workflowRunId} in repository ${repositoryOwner}/${repositoryName}`
)
const artifacts: Artifact[] = []
let artifacts: Artifact[] = []
const [retryOpts, requestOpts] = getRetryOptions(defaultGitHubOptions)
const opts: OctokitOptions = {
@ -37,14 +41,17 @@ export async function listArtifacts(
const github = getOctokit(token, opts, retry, requestLog)
let currentPageNumber = 1
const {data: listArtifactResponse} =
await github.rest.actions.listWorkflowRunArtifacts({
const {data: listArtifactResponse} = await github.request(
'GET /repos/{owner}/{repo}/actions/runs/{run_id}/artifacts',
{
owner: repositoryOwner,
repo: repositoryName,
run_id: workflowRunId,
per_page: paginationCount,
page: currentPageNumber
})
}
)
let numberOfPages = Math.ceil(
listArtifactResponse.total_count / paginationCount
@ -52,7 +59,7 @@ export async function listArtifacts(
const totalArtifactCount = listArtifactResponse.total_count
if (totalArtifactCount > maximumArtifactCount) {
warning(
`Workflow run ${workflowRunId} has more than 1000 artifacts. Results will be incomplete as only the first ${maximumArtifactCount} artifacts will be returned`
`Workflow run ${workflowRunId} has ${totalArtifactCount} artifacts, exceeding the limit of ${maximumArtifactCount}. Results will be incomplete as only the first ${maximumArtifactCount} artifacts will be returned`
)
numberOfPages = maxNumberOfPages
}
@ -62,42 +69,119 @@ export async function listArtifacts(
artifacts.push({
name: artifact.name,
id: artifact.id,
url: artifact.url,
size: artifact.size_in_bytes
size: artifact.size_in_bytes,
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: (artifact as ArtifactResponse).digest
})
}
// Move to the next page
currentPageNumber++
// Iterate over any remaining pages
for (
currentPageNumber;
currentPageNumber < numberOfPages;
currentPageNumber <= numberOfPages;
currentPageNumber++
) {
currentPageNumber++
debug(`Fetching page ${currentPageNumber} of artifact list`)
const {data: listArtifactResponse} =
await github.rest.actions.listWorkflowRunArtifacts({
const {data: listArtifactResponse} = await github.request(
'GET /repos/{owner}/{repo}/actions/runs/{run_id}/artifacts',
{
owner: repositoryOwner,
repo: repositoryName,
run_id: workflowRunId,
per_page: paginationCount,
page: currentPageNumber
})
}
)
for (const artifact of listArtifactResponse.artifacts) {
artifacts.push({
name: artifact.name,
id: artifact.id,
url: artifact.url,
size: artifact.size_in_bytes
size: artifact.size_in_bytes,
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: (artifact as ArtifactResponse).digest
})
}
}
info(`Finished fetching artifact list`)
if (latest) {
artifacts = filterLatest(artifacts)
}
info(`Found ${artifacts.length} artifact(s)`)
return {
artifacts
}
}
export async function listArtifactsInternal(
latest = false
): Promise<ListArtifactsResponse> {
const artifactClient = internalArtifactTwirpClient()
const {workflowRunBackendId, workflowJobRunBackendId} =
getBackendIdsFromToken()
const req: ListArtifactsRequest = {
workflowRunBackendId,
workflowJobRunBackendId
}
const res = await artifactClient.ListArtifacts(req)
let artifacts: Artifact[] = res.artifacts.map(artifact => ({
name: artifact.name,
id: Number(artifact.databaseId),
size: Number(artifact.size),
createdAt: artifact.createdAt
? Timestamp.toDate(artifact.createdAt)
: undefined,
digest: artifact.digest?.value
}))
if (latest) {
artifacts = filterLatest(artifacts)
}
info(`Found ${artifacts.length} artifact(s)`)
return {
artifacts
}
}
/**
* This exists so that we don't have to use 'any' when receiving the artifact list from the GitHub API.
* The digest field is not present in OpenAPI/types at time of writing, which necessitates this change.
*/
interface ArtifactResponse {
name: string
id: number
size_in_bytes: number
created_at?: string
digest?: string
}
/**
* Filters a list of artifacts to only include the latest artifact for each name
* @param artifacts The artifacts to filter
* @returns The filtered list of artifacts
*/
function filterLatest(artifacts: Artifact[]): Artifact[] {
artifacts.sort((a, b) => b.id - a.id)
const latestArtifacts: Artifact[] = []
const seenArtifactNames = new Set<string>()
for (const artifact of artifacts) {
if (!seenArtifactNames.has(artifact.name)) {
latestArtifacts.push(artifact)
seenArtifactNames.add(artifact.name)
}
}
return latestArtifacts
}
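// --- Editor's example (a minimal sketch) ---
// `filterLatest` keeps the highest-id (newest) artifact per name. Given two
// rerun copies of 'logs', only id 5 survives; callers get the same behavior
// through `listArtifacts({latest: true})` on the client.
function exampleFilterLatest(): Artifact[] {
  const demo: Artifact[] = [
    {name: 'logs', id: 2, size: 10},
    {name: 'logs', id: 5, size: 12},
    {name: 'report', id: 3, size: 7}
  ]
  return filterLatest(demo) // => [{name: 'logs', id: 5, ...}, {name: 'report', ...}]
}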


@ -3,6 +3,9 @@ import {BearerCredentialHandler} from '@actions/http-client/lib/auth'
import {info, debug} from '@actions/core'
import {ArtifactServiceClientJSON} from '../../generated'
import {getResultsServiceUrl, getRuntimeToken} from './config'
import {getUserAgentString} from './user-agent'
import {NetworkError, UsageError} from './errors'
import {maskSecretUrls} from './util'
// The twirp http client must implement this interface
interface Rpc {
@ -52,17 +55,17 @@ class ArtifactHttpClient implements Rpc {
contentType: 'application/json' | 'application/protobuf',
data: object | Uint8Array
): Promise<object | Uint8Array> {
const url = `${this.baseUrl}/twirp/${service}/${method}`
debug(`Requesting ${url}`)
const url = new URL(`/twirp/${service}/${method}`, this.baseUrl).href
debug(`[Request] ${method} ${url}`)
const headers = {
'Content-Type': contentType
}
try {
const response = await this.retryableRequest(async () =>
const {body} = await this.retryableRequest(async () =>
this.httpClient.post(url, JSON.stringify(data), headers)
)
const body = await response.readBody()
return JSON.parse(body)
return body
} catch (error) {
throw new Error(`Failed to ${method}: ${error.message}`)
}
@ -70,23 +73,47 @@ class ArtifactHttpClient implements Rpc {
async retryableRequest(
operation: () => Promise<HttpClientResponse>
): Promise<HttpClientResponse> {
): Promise<{response: HttpClientResponse; body: object}> {
let attempt = 0
let errorMessage = ''
let rawBody = ''
while (attempt < this.maxAttempts) {
let isRetryable = false
try {
const response = await operation()
const statusCode = response.message.statusCode
rawBody = await response.readBody()
debug(`[Response] - ${response.message.statusCode}`)
debug(`Headers: ${JSON.stringify(response.message.headers, null, 2)}`)
const body = JSON.parse(rawBody)
maskSecretUrls(body)
debug(`Body: ${JSON.stringify(body, null, 2)}`)
if (this.isSuccessStatusCode(statusCode)) {
return response
return {response, body}
}
isRetryable = this.isRetryableHttpStatusCode(statusCode)
errorMessage = `Failed request: (${statusCode}) ${response.message.statusMessage}`
if (body.msg) {
if (UsageError.isUsageErrorMessage(body.msg)) {
throw new UsageError()
}
errorMessage = `${errorMessage}: ${body.msg}`
}
} catch (error) {
if (error instanceof SyntaxError) {
debug(`Raw Body: ${rawBody}`)
}
if (error instanceof UsageError) {
throw error
}
if (NetworkError.isNetworkErrorCode(error?.code)) {
throw new NetworkError(error?.code)
}
isRetryable = true
errorMessage = error.message
}
@ -128,8 +155,7 @@ class ArtifactHttpClient implements Rpc {
HttpCodes.GatewayTimeout,
HttpCodes.InternalServerError,
HttpCodes.ServiceUnavailable,
HttpCodes.TooManyRequests,
413 // Payload Too Large
HttpCodes.TooManyRequests
]
return retryableStatusCodes.includes(statusCode)
@ -157,17 +183,16 @@ class ArtifactHttpClient implements Rpc {
}
}
export function createArtifactTwirpClient(
type: 'upload' | 'download',
maxAttempts?: number,
baseRetryIntervalMilliseconds?: number,
export function internalArtifactTwirpClient(options?: {
maxAttempts?: number
retryIntervalMs?: number
retryMultiplier?: number
): ArtifactServiceClientJSON {
}): ArtifactServiceClientJSON {
const client = new ArtifactHttpClient(
`@actions/artifact-${type}`,
maxAttempts,
baseRetryIntervalMilliseconds,
retryMultiplier
getUserAgentString(),
options?.maxAttempts,
options?.retryIntervalMs,
options?.retryMultiplier
)
return new ArtifactServiceClientJSON(client)
}
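// --- Editor's example (a minimal sketch) ---
// Tuning the internal client's retry behavior; the values are illustrative.
// The client still reads ACTIONS_RESULTS_URL and the runtime token from the
// environment, so this only works inside an Actions run.
function exampleMakeClient(): ArtifactServiceClientJSON {
  return internalArtifactTwirpClient({
    maxAttempts: 7,
    retryIntervalMs: 2000,
    retryMultiplier: 2
  })
}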


@ -1,3 +1,6 @@
import os from 'os'
import {info} from '@actions/core'
// Used for controlling the highWaterMark value of the zip that is being streamed
// The same value is used as the chunk size that is use during upload to blob storage
export function getUploadChunkSize(): number {
@ -17,14 +20,21 @@ export function getResultsServiceUrl(): string {
if (!resultsUrl) {
throw new Error('Unable to get the ACTIONS_RESULTS_URL env variable')
}
return resultsUrl
return new URL(resultsUrl).origin
}
export function isGhes(): boolean {
const ghUrl = new URL(
process.env['GITHUB_SERVER_URL'] || 'https://github.com'
)
return ghUrl.hostname.toUpperCase() !== 'GITHUB.COM'
const hostname = ghUrl.hostname.trimEnd().toUpperCase()
const isGitHubHost = hostname === 'GITHUB.COM'
const isGheHost = hostname.endsWith('.GHE.COM')
const isLocalHost = hostname.endsWith('.LOCALHOST')
return !isGitHubHost && !isGheHost && !isLocalHost
}
export function getGitHubWorkspaceDir(): string {
@ -34,3 +44,72 @@ export function getGitHubWorkspaceDir(): string {
}
return ghWorkspaceDir
}
// The maximum value of concurrency is 300.
// This value can be changed with ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY variable.
export function getConcurrency(): number {
const numCPUs = os.cpus().length
let concurrencyCap = 32
if (numCPUs > 4) {
const concurrency = 16 * numCPUs
concurrencyCap = concurrency > 300 ? 300 : concurrency
}
const concurrencyOverride = process.env['ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY']
if (concurrencyOverride) {
const concurrency = parseInt(concurrencyOverride)
if (isNaN(concurrency) || concurrency < 1) {
throw new Error(
'Invalid value set for ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY env variable'
)
}
if (concurrency < concurrencyCap) {
info(
`Set concurrency based on the value set in ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY.`
)
return concurrency
}
info(
`ACTIONS_ARTIFACT_UPLOAD_CONCURRENCY is higher than the cap of ${concurrencyCap} based on the number of cpus. Setting it to the maximum value allowed.`
)
return concurrencyCap
}
// default concurrency to 5
return 5
}
export function getUploadChunkTimeout(): number {
const timeoutVar = process.env['ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS']
if (!timeoutVar) {
return 300000 // 5 minutes
}
const timeout = parseInt(timeoutVar)
if (isNaN(timeout)) {
throw new Error(
'Invalid value set for ACTIONS_ARTIFACT_UPLOAD_TIMEOUT_MS env variable'
)
}
return timeout
}
// This value can be changed with ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT variable.
// Defaults to 1000 as a safeguard for rate limiting.
export function getMaxArtifactListCount(): number {
const maxCountVar =
process.env['ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT'] || '1000'
const maxCount = parseInt(maxCountVar)
if (isNaN(maxCount) || maxCount < 1) {
throw new Error(
'Invalid value set for ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT env variable'
)
}
return maxCount
}
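// --- Editor's example (a minimal sketch) ---
// The artifact lister captures getMaxArtifactListCount() in a module-level
// const, so ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT must be set before that
// module loads (e.g. in the workflow's `env:` block). With a cap of 2000 and
// 100 artifacts per page, at most Math.ceil(2000 / 100) = 20 pages are fetched.
function examplePageBudget(): number {
  process.env['ACTIONS_ARTIFACT_MAX_ARTIFACT_COUNT'] = '2000' // hypothetical override
  const paginationCount = 100 // mirrors list-artifacts.ts
  return Math.ceil(getMaxArtifactListCount() / paginationCount) // => 20
}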


@ -0,0 +1,72 @@
export class FilesNotFoundError extends Error {
files: string[]
constructor(files: string[] = []) {
let message = 'No files were found to upload'
if (files.length > 0) {
message += `: ${files.join(', ')}`
}
super(message)
this.files = files
this.name = 'FilesNotFoundError'
}
}
export class InvalidResponseError extends Error {
constructor(message: string) {
super(message)
this.name = 'InvalidResponseError'
}
}
export class ArtifactNotFoundError extends Error {
constructor(message = 'Artifact not found') {
super(message)
this.name = 'ArtifactNotFoundError'
}
}
export class GHESNotSupportedError extends Error {
constructor(
message = '@actions/artifact v2.0.0+, upload-artifact@v4+ and download-artifact@v4+ are not currently supported on GHES.'
) {
super(message)
this.name = 'GHESNotSupportedError'
}
}
export class NetworkError extends Error {
code: string
constructor(code: string) {
const message = `Unable to make request: ${code}\nIf you are using self-hosted runners, please make sure your runner has access to all GitHub endpoints: https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners#communication-between-self-hosted-runners-and-github`
super(message)
this.code = code
this.name = 'NetworkError'
}
static isNetworkErrorCode = (code?: string): boolean => {
if (!code) return false
return [
'ECONNRESET',
'ENOTFOUND',
'ETIMEDOUT',
'ECONNREFUSED',
'EHOSTUNREACH'
].includes(code)
}
}
export class UsageError extends Error {
constructor() {
const message = `Artifact storage quota has been hit. Unable to upload any new artifacts. Usage is recalculated every 6-12 hours.\nMore info on storage limits: https://docs.github.com/en/billing/managing-billing-for-github-actions/about-billing-for-github-actions#calculating-minute-and-storage-spending`
super(message)
this.name = 'UsageError'
}
static isUsageErrorMessage = (msg?: string): boolean => {
if (!msg) return false
return msg.includes('insufficient usage')
}
}
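// --- Editor's example (a minimal sketch) ---
// The typed errors above let callers branch on failure modes instead of
// parsing messages; only "not found" is treated as a soft miss here.
async function exampleGetIfPresent<T>(
  lookup: () => Promise<T>
): Promise<T | undefined> {
  try {
    return await lookup()
  } catch (error) {
    if (error instanceof ArtifactNotFoundError) {
      return undefined // absence is an expected outcome
    }
    if (error instanceof NetworkError) {
      // error.code carries the failing socket code, e.g. 'ECONNRESET'
      console.warn(`Network failure (${error.code}); consider retrying`)
    }
    throw error // UsageError, InvalidResponseError, etc. stay fatal
  }
}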


@ -1,14 +1,7 @@
/*****************************************************************************
* *
* UploadArtifact *
* *
*****************************************************************************/
export interface UploadResponse {
/**
* Denotes if an artifact was successfully uploaded
*/
success: boolean
/**
* Response from the server when an artifact is uploaded
*/
export interface UploadArtifactResponse {
/**
* Total size of the artifact in bytes. Not provided if no artifact was uploaded
*/
@ -19,9 +12,17 @@ export interface UploadResponse {
* This ID can be used as input to other APIs to download, delete or get more information about an artifact: https://docs.github.com/en/rest/actions/artifacts
*/
id?: number
/**
* The SHA256 digest of the artifact that was created. Not provided if no artifact was uploaded
*/
digest?: string
}
export interface UploadOptions {
/**
* Options for uploading an artifact
*/
export interface UploadArtifactOptions {
/**
* Duration after which artifact will expire in days.
*
@ -38,31 +39,43 @@ export interface UploadOptions {
* input of 0 assumes default retention setting.
*/
retentionDays?: number
/**
* The level of compression for Zlib to be applied to the artifact archive.
* The value can range from 0 to 9:
* - 0: No compression
* - 1: Best speed
* - 6: Default compression (same as GNU Gzip)
* - 9: Best compression
* Higher levels will result in better compression, but will take longer to complete.
* For large files that are not easily compressed, a value of 0 is recommended for significantly faster uploads.
*/
compressionLevel?: number
}
/*****************************************************************************
* *
* GetArtifact *
* *
*****************************************************************************/
/**
* Response from the server when getting an artifact
*/
export interface GetArtifactResponse {
/**
* If an artifact was found
*/
success: boolean
/**
* Metadata about the artifact that was found
*/
artifact?: Artifact
artifact: Artifact
}
/*****************************************************************************
* *
* ListArtifact *
* *
*****************************************************************************/
/**
* Options for listing artifacts
*/
export interface ListArtifactsOptions {
/**
* Filter the workflow run's artifacts to the latest by name
* In the case of reruns, this can be useful to avoid duplicates
*/
latest?: boolean
}
/**
* Response from the server when listing artifacts
*/
export interface ListArtifactsResponse {
/**
* A list of artifacts that were found
@ -70,34 +83,48 @@ export interface ListArtifactsResponse {
artifacts: Artifact[]
}
/*****************************************************************************
* *
* DownloadArtifact *
* *
*****************************************************************************/
/**
* Response from the server when downloading an artifact
*/
export interface DownloadArtifactResponse {
/**
* If the artifact download was successful
*/
success: boolean
/**
* The path where the artifact was downloaded to
*/
downloadPath?: string
/**
* Returns true if the digest of the downloaded artifact does not match the expected hash
*/
digestMismatch?: boolean
}
/**
* Options for downloading an artifact
*/
export interface DownloadArtifactOptions {
/**
* Denotes where the artifact will be downloaded to. If not specified, the artifact is downloaded to GITHUB_WORKSPACE
*/
path?: string
/**
* The hash that was computed for the artifact during upload. If provided, the outcome of the download
* will provide a digestMismatch property indicating whether the hash of the downloaded artifact
* matches the expected hash.
*/
expectedHash?: string
}
/*****************************************************************************
* *
* Shared *
* *
*****************************************************************************/
export interface StreamExtractResponse {
/**
* The SHA256 hash of the downloaded file
*/
sha256Digest?: string
}
/**
* An Actions Artifact
*/
export interface Artifact {
/**
* The name of the artifact
@ -109,13 +136,53 @@ export interface Artifact {
*/
id: number
/**
* The URL of the artifact
*/
url: string
/**
* The size of the artifact in bytes
*/
size: number
/**
* The time when the artifact was created
*/
createdAt?: Date
/**
* The digest of the artifact, computed at time of upload.
*/
digest?: string
}
// FindOptions are for fetching Artifact(s) out of the scope of the current run.
export interface FindOptions {
/**
* The criteria for finding Artifact(s) out of the scope of the current run.
*/
findBy?: {
/**
* Token with actions:read permissions
*/
token: string
/**
* WorkflowRun of the artifact(s) to lookup
*/
workflowRunId: number
/**
* Repository owner (eg. 'actions')
*/
repositoryOwner: string
/**
* Repository name (eg. 'toolkit')
*/
repositoryName: string
}
}
/**
* Response from the server when deleting an artifact
*/
export interface DeleteArtifactResponse {
/**
* The id of the artifact that was deleted
*/
id: number
}
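// --- Editor's example (a minimal sketch) ---
// A fully populated FindOptions value. All four fields are required once
// `findBy` is given; the ids and names below are placeholders.
const exampleFindOptions: FindOptions = {
  findBy: {
    token: process.env['GITHUB_TOKEN'] ?? '', // needs actions:read on the target repo
    workflowRunId: 123456789,
    repositoryOwner: 'actions',
    repositoryName: 'toolkit'
  }
}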


@ -1,5 +1,7 @@
import * as core from '@actions/core'
import {getRuntimeToken} from './config'
import jwt_decode from 'jwt-decode'
import {debug, setSecret} from '@actions/core'
export interface BackendIds {
workflowRunBackendId: string
@ -11,7 +13,7 @@ interface ActionsToken {
}
const InvalidJwtError = new Error(
'Failed to get backend IDs: The provided JWT token is invalid'
'Failed to get backend IDs: The provided JWT token is invalid and/or missing claims'
)
// uses the JWT token claims to get the
@ -41,25 +43,103 @@ export function getBackendIdsFromToken(): BackendIds {
for (const scopes of scpParts) {
const scopeParts = scopes.split(':')
if (scopeParts?.[0] !== 'Actions.Results') {
// not the Actions.Results scope
continue
}
/*
* example scopeParts:
* ["Actions.Results", "ce7f54c7-61c7-4aae-887f-30da475f5f1a", "ca395085-040a-526b-2ce8-bdc85f692774"]
*/
if (scopeParts.length !== 3) {
// not the Actions.Results scope
continue
// missing expected number of claims
throw InvalidJwtError
}
if (scopeParts[0] !== 'Actions.Results') {
// not the Actions.Results scope
continue
}
return {
const ids = {
workflowRunBackendId: scopeParts[1],
workflowJobRunBackendId: scopeParts[2]
}
core.debug(`Workflow Run Backend ID: ${ids.workflowRunBackendId}`)
core.debug(`Workflow Job Run Backend ID: ${ids.workflowJobRunBackendId}`)
return ids
}
throw InvalidJwtError
}
/**
* Masks the `sig` parameter in a URL and sets it as a secret.
*
* @param url - The URL containing the signature parameter to mask
* @remarks
* This function attempts to parse the provided URL and identify the 'sig' query parameter.
* If found, it registers both the raw and URL-encoded signature values as secrets using
* the Actions `setSecret` API, which prevents them from being displayed in logs.
*
* The function handles errors gracefully if URL parsing fails, logging them as debug messages.
*
* @example
* ```typescript
* // Mask a signature in an Azure SAS token URL
* maskSigUrl('https://example.blob.core.windows.net/container/file.txt?sig=abc123&se=2023-01-01');
* ```
*/
export function maskSigUrl(url: string): void {
if (!url) return
try {
const parsedUrl = new URL(url)
const signature = parsedUrl.searchParams.get('sig')
if (signature) {
setSecret(signature)
setSecret(encodeURIComponent(signature))
}
} catch (error) {
debug(
`Failed to parse URL: ${url} ${
error instanceof Error ? error.message : String(error)
}`
)
}
}
/**
* Masks sensitive information in URLs containing signature parameters.
* Currently supports masking 'sig' parameters in the 'signed_upload_url'
* and 'signed_url' properties of the provided object.
*
* @param body - The object that may contain signed URL properties
* @remarks
* This function extracts URLs from the object properties and calls maskSigUrl
* on each one to redact sensitive signature information. The function doesn't
* modify the original object; it only marks the signatures as secrets for
* logging purposes.
*
* @example
* ```typescript
* const responseBody = {
* signed_upload_url: 'https://example.com?sig=abc123',
* signed_url: 'https://example.com?sig=def456'
* };
* maskSecretUrls(responseBody);
* ```
*/
export function maskSecretUrls(body: Record<string, unknown> | null): void {
if (typeof body !== 'object' || body === null) {
debug('body is not an object or is null')
return
}
if (
'signed_upload_url' in body &&
typeof body.signed_upload_url === 'string'
) {
maskSigUrl(body.signed_upload_url)
}
if ('signed_url' in body && typeof body.signed_url === 'string') {
maskSigUrl(body.signed_url)
}
}


@ -1,26 +1,26 @@
import {BlobClient, BlockBlobUploadStreamOptions} from '@azure/storage-blob'
import {TransferProgressEvent} from '@azure/core-http'
import {TransferProgressEvent} from '@azure/core-http-compat'
import {ZipUploadStream} from './zip'
import {getUploadChunkSize} from '../shared/config'
import {
getUploadChunkSize,
getConcurrency,
getUploadChunkTimeout
} from '../shared/config'
import * as core from '@actions/core'
import * as crypto from 'crypto'
import * as stream from 'stream'
import {NetworkError} from '../shared/errors'
export interface BlobUploadResponse {
/**
* Whether the upload was successful
*/
isSuccess: boolean
/**
* The total reported upload size in bytes. Empty if the upload failed
*/
uploadSize?: number
/**
* The MD5 hash of the uploaded file. Empty if the upload failed
* The SHA256 hash of the uploaded file. Empty if the upload failed
*/
md5Hash?: string
sha256Hash?: string
}
export async function uploadZipToBlobStorage(
@ -28,69 +28,85 @@ export async function uploadZipToBlobStorage(
zipUploadStream: ZipUploadStream
): Promise<BlobUploadResponse> {
let uploadByteCount = 0
let lastProgressTime = Date.now()
const abortController = new AbortController()
const maxBuffers = 5
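// Rejects if no upload progress is observed within `interval` ms; the abort
// listener clears the timer and resolves once the upload settles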
const chunkTimer = async (interval: number): Promise<void> =>
new Promise((resolve, reject) => {
const timer = setInterval(() => {
if (Date.now() - lastProgressTime > interval) {
reject(new Error('Upload progress stalled.'))
}
}, interval)
abortController.signal.addEventListener('abort', () => {
clearInterval(timer)
resolve()
})
})
const maxConcurrency = getConcurrency()
const bufferSize = getUploadChunkSize()
const blobClient = new BlobClient(authenticatedUploadURL)
const blockBlobClient = blobClient.getBlockBlobClient()
core.debug(
`Uploading artifact zip to blob storage with maxBuffers: ${maxBuffers}, bufferSize: ${bufferSize}`
`Uploading artifact zip to blob storage with maxConcurrency: ${maxConcurrency}, bufferSize: ${bufferSize}`
)
const uploadCallback = (progress: TransferProgressEvent): void => {
core.info(`Uploaded bytes ${progress.loadedBytes}`)
uploadByteCount = progress.loadedBytes
lastProgressTime = Date.now()
}
const options: BlockBlobUploadStreamOptions = {
blobHTTPHeaders: {blobContentType: 'zip'},
onProgress: uploadCallback
onProgress: uploadCallback,
abortSignal: abortController.signal
}
let md5Hash: string | undefined = undefined
let sha256Hash: string | undefined = undefined
const uploadStream = new stream.PassThrough()
const hashStream = crypto.createHash('md5')
const hashStream = crypto.createHash('sha256')
zipUploadStream.pipe(uploadStream) // This stream is used for the upload
zipUploadStream.pipe(hashStream).setEncoding('hex') // This stream computes a hash of the zip content, used for the integrity check
core.info('Beginning upload of artifact content to blob storage')
try {
core.info('Beginning upload of artifact content to blob storage')
await blockBlobClient.uploadStream(
uploadStream,
bufferSize,
maxBuffers,
options
)
core.info('Finished uploading artifact content to blob storage!')
hashStream.end()
md5Hash = hashStream.read() as string
core.info(`MD5 hash of uploaded artifact zip is ${md5Hash}`)
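// Race the upload against the stall timer so a hung connection fails fast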
await Promise.race([
blockBlobClient.uploadStream(
uploadStream,
bufferSize,
maxConcurrency,
options
),
chunkTimer(getUploadChunkTimeout())
])
} catch (error) {
core.warning(
`Failed to upload artifact zip to blob storage, error: ${error}`
)
return {
isSuccess: false
if (NetworkError.isNetworkErrorCode(error?.code)) {
throw new NetworkError(error?.code)
}
throw error
} finally {
abortController.abort()
}
core.info('Finished uploading artifact content to blob storage!')
hashStream.end()
sha256Hash = hashStream.read() as string
core.info(`SHA256 digest of uploaded artifact zip is ${sha256Hash}`)
if (uploadByteCount === 0) {
core.warning(
`No data was uploaded to blob storage. Reported upload byte count is 0`
`No data was uploaded to blob storage. Reported upload byte count is 0.`
)
return {
isSuccess: false
}
}
return {
isSuccess: true,
uploadSize: uploadByteCount,
md5Hash
sha256Hash
}
}


@ -1,8 +1,11 @@
import * as core from '@actions/core'
import {UploadOptions, UploadResponse} from '../shared/interfaces'
import {
UploadArtifactOptions,
UploadArtifactResponse
} from '../shared/interfaces'
import {getExpiration} from './retention'
import {validateArtifactName} from './path-and-artifact-name-validation'
import {createArtifactTwirpClient} from '../shared/artifact-twirp-client'
import {internalArtifactTwirpClient} from '../shared/artifact-twirp-client'
import {
UploadZipSpecification,
getUploadZipSpecification,
@ -16,13 +19,14 @@ import {
FinalizeArtifactRequest,
StringValue
} from '../../generated'
import {FilesNotFoundError, InvalidResponseError} from '../shared/errors'
export async function uploadArtifact(
name: string,
files: string[],
rootDirectory: string,
options?: UploadOptions | undefined
): Promise<UploadResponse> {
options?: UploadArtifactOptions | undefined
): Promise<UploadArtifactResponse> {
validateArtifactName(name)
validateRootDirectory(rootDirectory)
@ -31,31 +35,16 @@ export async function uploadArtifact(
rootDirectory
)
if (zipSpecification.length === 0) {
core.warning(`No files were found to upload`)
return {
success: false
}
throw new FilesNotFoundError(
zipSpecification.flatMap(s => (s.sourcePath ? [s.sourcePath] : []))
)
}
const zipUploadStream = await createZipUploadStream(zipSpecification)
// get the IDs needed for the artifact creation
const backendIds = getBackendIdsFromToken()
if (!backendIds.workflowRunBackendId || !backendIds.workflowJobRunBackendId) {
core.warning(
`Failed to get the necessary backend ids which are required to create the artifact`
)
return {
success: false
}
}
core.debug(`Workflow Run Backend ID: ${backendIds.workflowRunBackendId}`)
core.debug(
`Workflow Job Run Backend ID: ${backendIds.workflowJobRunBackendId}`
)
// create the artifact client
const artifactClient = createArtifactTwirpClient('upload')
const artifactClient = internalArtifactTwirpClient()
// create the artifact
const createArtifactReq: CreateArtifactRequest = {
@ -71,26 +60,24 @@ export async function uploadArtifact(
createArtifactReq.expiresAt = expiresAt
}
const createArtifactResp = await artifactClient.CreateArtifact(
createArtifactReq
)
const createArtifactResp =
await artifactClient.CreateArtifact(createArtifactReq)
if (!createArtifactResp.ok) {
core.warning(`Failed to create artifact`)
return {
success: false
}
throw new InvalidResponseError(
'CreateArtifact: response from backend was not ok'
)
}
const zipUploadStream = await createZipUploadStream(
zipSpecification,
options?.compressionLevel
)
// Upload zip to blob storage
const uploadResult = await uploadZipToBlobStorage(
createArtifactResp.signedUploadUrl,
zipUploadStream
)
if (uploadResult.isSuccess === false) {
return {
success: false
}
}
// finalize the artifact
const finalizeArtifactReq: FinalizeArtifactRequest = {
@ -100,22 +87,20 @@ export async function uploadArtifact(
size: uploadResult.uploadSize ? uploadResult.uploadSize.toString() : '0'
}
if (uploadResult.md5Hash) {
if (uploadResult.sha256Hash) {
finalizeArtifactReq.hash = StringValue.create({
value: `md5:${uploadResult.md5Hash}`
value: `sha256:${uploadResult.sha256Hash}`
})
}
core.info(`Finalizing artifact upload`)
const finalizeArtifactResp = await artifactClient.FinalizeArtifact(
finalizeArtifactReq
)
const finalizeArtifactResp =
await artifactClient.FinalizeArtifact(finalizeArtifactReq)
if (!finalizeArtifactResp.ok) {
core.warning(`Failed to finalize artifact`)
return {
success: false
}
throw new InvalidResponseError(
'FinalizeArtifact: response from backend was not ok'
)
}
const artifactId = BigInt(finalizeArtifactResp.artifactId)
@ -124,8 +109,8 @@ export async function uploadArtifact(
)
return {
success: true,
size: uploadResult.uploadSize,
digest: uploadResult.sha256Hash,
id: Number(artifactId)
}
}


@ -13,6 +13,12 @@ export interface UploadZipSpecification {
* The destination path in a zip for a file
*/
destinationPath: string
/**
* Information about the file
* https://nodejs.org/api/fs.html#class-fsstats
*/
stats: fs.Stats
}
/**
@ -75,10 +81,11 @@ export function getUploadZipSpecification(
- file3.txt
*/
for (let file of filesToZip) {
if (!fs.existsSync(file)) {
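// lstat (rather than stat) records symlink status so links can be resolved later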
const stats = fs.lstatSync(file, {throwIfNoEntry: false})
if (!stats) {
throw new Error(`File ${file} does not exist`)
}
if (!fs.statSync(file).isDirectory()) {
if (!stats.isDirectory()) {
// Normalize and resolve, this allows for either absolute or relative paths to be used
file = normalize(file)
file = resolve(file)
@ -94,7 +101,8 @@ export function getUploadZipSpecification(
specification.push({
sourcePath: file,
destinationPath: uploadPath
destinationPath: uploadPath,
stats
})
} else {
// Empty directory
@ -103,7 +111,8 @@ export function getUploadZipSpecification(
specification.push({
sourcePath: null,
destinationPath: directoryPath
destinationPath: directoryPath,
stats
})
}
}


@ -1,10 +1,12 @@
import * as stream from 'stream'
import {realpath} from 'fs/promises'
import * as archiver from 'archiver'
import * as core from '@actions/core'
import {createReadStream} from 'fs'
import {UploadZipSpecification} from './upload-zip-specification'
import {getUploadChunkSize} from '../shared/config'
export const DEFAULT_COMPRESSION_LEVEL = 6
// Custom stream transformer so we can set the highWaterMark property
// See https://github.com/nodejs/node/issues/8855
export class ZipUploadStream extends stream.Transform {
@ -21,14 +23,16 @@ export class ZipUploadStream extends stream.Transform {
}
export async function createZipUploadStream(
uploadSpecification: UploadZipSpecification[]
uploadSpecification: UploadZipSpecification[],
compressionLevel: number = DEFAULT_COMPRESSION_LEVEL
): Promise<ZipUploadStream> {
core.debug(
`Creating Artifact archive with compressionLevel: ${compressionLevel}`
)
const zip = archiver.create('zip', {
zlib: {level: 9} // Sets the compression level.
// Available options are 0-9
// 0 => no compression
// 1 => fastest with low compression
// 9 => highest compression ratio but the slowest
highWaterMark: getUploadChunkSize(),
zlib: {level: compressionLevel}
})
// register callbacks for various events during the zip lifecycle
@ -39,8 +43,14 @@ export async function createZipUploadStream(
for (const file of uploadSpecification) {
if (file.sourcePath !== null) {
// Add a normal file to the zip
zip.append(createReadStream(file.sourcePath), {
// Check if symlink and resolve the source path
let sourcePath = file.sourcePath
if (file.stats.isSymbolicLink()) {
sourcePath = await realpath(file.sourcePath)
}
// Add the file to the zip
zip.file(sourcePath, {
name: file.destinationPath
})
} else {


@ -0,0 +1,9 @@
The MIT License (MIT)
Copyright 2024 GitHub
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

packages/attest/README.md (new file)

@ -0,0 +1,191 @@
# `@actions/attest`
Functions for generating signed attestations for workflow artifacts.
Attestations bind some subject (a named artifact along with its digest) to a
predicate (some assertion about that subject) using the [in-toto
statement](https://github.com/in-toto/attestation/tree/main/spec/v1) format. A
signature is generated for the attestation using a
[Sigstore](https://www.sigstore.dev/)-issued signing certificate.
Once the attestation has been created and signed, it will be uploaded to the GH
attestations API and associated with the repository from which the workflow was
initiated.
See [Using artifact attestations to establish provenance for builds](https://docs.github.com/en/actions/security-guides/using-artifact-attestations-to-establish-provenance-for-builds)
for more information on artifact attestations.
## Usage
### `attest`
The `attest` function takes the supplied subject/predicate pair and generates a
signed attestation.
```js
const { attest } = require('@actions/attest');
const core = require('@actions/core');
async function run() {
// In order to persist attestations to the repo, this should be a token with
// repository write permissions.
const ghToken = core.getInput('gh-token');
const attestation = await attest({
subjects: [{name: 'my-artifact-name', digest: { 'sha256': '36ab4667...'}}],
predicateType: 'https://in-toto.io/attestation/release',
predicate: { . . . },
token: ghToken
});
console.log(attestation);
}
run();
```
The `attest` function supports the following options:
```typescript
export type AttestOptions = {
// Deprecated. Use 'subjects' instead.
subjectName?: string
// Deprecated. Use 'subjects' instead.
subjectDigest?: Record<string, string>
// Collection of subjects to be attested
subjects?: Subject[]
// URI identifying the content type of the predicate being attested.
predicateType: string
// Predicate to be attested.
predicate: object
// GitHub token for writing attestations.
token: string
// Sigstore instance to use for signing. Must be one of "public-good" or
// "github".
sigstore?: 'public-good' | 'github'
// HTTP headers to include in request to attestations API.
headers?: {[header: string]: string | number | undefined}
// Whether to skip writing the attestation to the GH attestations API.
skipWrite?: boolean
}
export type Subject = {
// Name of the subject.
name: string
// Digests of the subject. Should be a map of digest algorithms to their hex-encoded values.
digest: Record<string, string>
}
```
### `attestProvenance`
The `attestProvenance` function accepts the name and digest of some artifact and
generates a build provenance attestation over those values.
The attestation is formed by first generating a [SLSA provenance
predicate](https://slsa.dev/spec/v1.0/provenance) populated with
[metadata](https://github.com/slsa-framework/github-actions-buildtypes/tree/main/workflow/v1)
pulled from the GitHub Actions run.
```js
const { attestProvenance } = require('@actions/attest');
const core = require('@actions/core');
async function run() {
// In order to persist attestations to the repo, this should be a token with
// repository write permissions.
const ghToken = core.getInput('gh-token');
const attestation = await attestProvenance({
subjectName: 'my-artifact-name',
subjectDigest: { 'sha256': '36ab4667...'},
token: ghToken
});
console.log(attestation);
}
run();
```
The `attestProvenance` function supports the following options:
```typescript
export type AttestProvenanceOptions = {
// Deprecated. Use 'subjects' instead.
subjectName?: string
// Deprecated. Use 'subjects' instead.
subjectDigest?: Record<string, string>
// Collection of subjects to be attested
subjects?: Subject[]
// GitHub token for writing attestations.
token: string
// Sigstore instance to use for signing. Must be one of "public-good" or
// "github".
sigstore?: 'public-good' | 'github'
// HTTP headers to include in request to attestations API.
headers?: {[header: string]: string | number | undefined}
// Whether to skip writing the attestation to the GH attestations API.
skipWrite?: boolean
// Issuer URL responsible for minting the OIDC token from which the
// provenance data is read. Defaults to
// 'https://token.actions.githubusercontent.com'.
issuer?: string
}
```
### `Attestation`
The `Attestation` returned by `attest`/`attestProvenance` has the following
fields:
```typescript
export type Attestation = {
/*
* JSON-serialized Sigstore bundle containing the provenance attestation,
* signature, signing certificate and witnessed timestamp.
*/
bundle: SerializedBundle
/*
* PEM-encoded signing certificate used to sign the attestation.
*/
certificate: string
/*
* ID of Rekor transparency log entry created for the attestation (if
* applicable).
*/
tlogID?: string
/*
* ID of the persisted attestation (accessible via the GH API).
*/
attestationID?: string
}
```
For details about the Sigstore bundle format, see the [Bundle protobuf
specification](https://github.com/sigstore/protobuf-specs/blob/main/protos/sigstore_bundle.proto).
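As a rough sketch (the file name and use of Node's `fs` module are illustrative, not part of this package's API), the serialized bundle can be written out for later verification:
```js
const fs = require('fs');

// `attestation` comes from a prior attest()/attestProvenance() call
fs.writeFileSync(
  'attestation.sigstore.json',
  JSON.stringify(attestation.bundle)
);
```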
## Sigstore Instance
When generating the signed attestation there are two different Sigstore
instances which can be used to issue the signing certificate. By default,
workflows initiated from public repositories will use the Sigstore public-good
instance and persist the attestation signature to the public [Rekor transparency
log](https://docs.sigstore.dev/logging/overview/). Workflows initiated from
private/internal repositories will use the GitHub-internal Sigstore instance
which uses a signed timestamp issued by GitHub's timestamp authority in place of
the public transparency log.
The default Sigstore instance selection can be overridden by passing an explicit
value of either "public-good" or "github" for the `sigstore` option when calling
either `attest` or `attestProvenance`.
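For example, a workflow could force the GitHub-internal instance even for a public repository (a sketch reusing the `ghToken` variable from the earlier examples; the subject and predicate values are placeholders):
```js
const attestation = await attest({
  subjects: [{name: 'my-artifact-name', digest: { 'sha256': '36ab4667...'}}],
  predicateType: 'https://in-toto.io/attestation/release',
  predicate: { version: '1.0.0' }, // placeholder predicate
  token: ghToken,
  sigstore: 'github' // bypass visibility-based selection
});
```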
## Storage
Attestations created by `attest`/`attestProvenance` will be uploaded to the GH
attestations API and associated with the appropriate repository. Attestation
storage is only supported for public repositories or repositories which belong
to a GitHub Enterprise Cloud account.
In order to generate attestations for private, non-Enterprise repositories, the
`skipWrite` option should be set to `true`.
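A minimal sketch for that case (subject values are placeholders; note that the returned `attestationID` will be undefined since nothing is persisted):
```js
const attestation = await attestProvenance({
  subjectName: 'my-artifact-name',
  subjectDigest: { 'sha256': '36ab4667...'},
  token: ghToken,
  skipWrite: true // do not upload to the attestations API
});
```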


@ -0,0 +1,63 @@
# @actions/attest Releases
### 2.0.0
- Add support for Node 24 [#2110](https://github.com/actions/toolkit/pull/2110)
- Bump @sigstore/bundle from 3.0.0 to 3.1.0
- Bump @sigstore/sign from 3.0.0 to 3.1.0
- Bump jose from 5.2.3 to 5.10.0
### 1.6.0
- Update `buildSLSAProvenancePredicate` to populate `workflow.ref` field from the `ref` claim in the OIDC token [#1969](https://github.com/actions/toolkit/pull/1969)
### 1.5.0
- Bump @actions/core from 1.10.1 to 1.11.1 [#1847](https://github.com/actions/toolkit/pull/1847)
- Bump @sigstore/bundle from 2.3.2 to 3.0.0 [#1846](https://github.com/actions/toolkit/pull/1846)
- Bump @sigstore/sign from 2.3.2 to 3.0.0 [#1846](https://github.com/actions/toolkit/pull/1846)
- Support for generating multi-subject attestations [#1864](https://github.com/actions/toolkit/pull/1865)
- Fix bug in `buildSLSAProvenancePredicate` related to `workflow_ref` OIDC token claims containing the "@" symbol in the tag name [#1863](https://github.com/actions/toolkit/pull/1863)
### 1.4.2
- Fix bug in `buildSLSAProvenancePredicate`/`attestProvenance` when generating provenance statement for enterprise account using customized OIDC issuer value [#1823](https://github.com/actions/toolkit/pull/1823)
### 1.4.1
- Bump @actions/http-client from 2.2.1 to 2.2.3 [#1805](https://github.com/actions/toolkit/pull/1805)
### 1.4.0
- Add new `headers` parameter to the `attest` and `attestProvenance` functions [#1790](https://github.com/actions/toolkit/pull/1790)
- Update `buildSLSAProvenancePredicate`/`attestProvenance` to automatically derive default OIDC issuer URL from current execution context [#1796](https://github.com/actions/toolkit/pull/1796)
### 1.3.1
- Fix bug with proxy support when retrieving JWKS for OIDC issuer [#1776](https://github.com/actions/toolkit/pull/1776)
### 1.3.0
- Dynamic construction of Sigstore API URLs [#1735](https://github.com/actions/toolkit/pull/1735)
- Switch to new GH provenance build type [#1745](https://github.com/actions/toolkit/pull/1745)
- Fetch existing Rekor entry on 409 conflict error [#1759](https://github.com/actions/toolkit/pull/1759)
- Bump @sigstore/bundle from 2.3.0 to 2.3.2 [#1738](https://github.com/actions/toolkit/pull/1738)
- Bump @sigstore/sign from 2.3.0 to 2.3.2 [#1738](https://github.com/actions/toolkit/pull/1738)
### 1.2.1
- Retry request on attestation persistence failure [#1725](https://github.com/actions/toolkit/pull/1725)
### 1.2.0
- Generate attestations using the v0.3 Sigstore bundle format [#1701](https://github.com/actions/toolkit/pull/1701)
- Bump @sigstore/bundle from 2.2.0 to 2.3.0 [#1701](https://github.com/actions/toolkit/pull/1701)
- Bump @sigstore/sign from 2.2.3 to 2.3.0 [#1701](https://github.com/actions/toolkit/pull/1701)
- Remove dependency on make-fetch-happen [#1714](https://github.com/actions/toolkit/pull/1714)
### 1.1.0
- Updates the `attestProvenance` function to retrieve a token from the GitHub OIDC provider and use the token claims to populate the provenance statement [#1693](https://github.com/actions/toolkit/pull/1693)
### 1.0.0
- Initial release


@ -0,0 +1,19 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`buildIntotoStatement returns an intoto statement 1`] = `
{
"_type": "https://in-toto.io/Statement/v1",
"predicate": {
"key": "value",
},
"predicateType": "predicatey",
"subject": [
{
"digest": {
"sha256": "7d070f6b64d9bcc530fe99cc21eaaa4b3c364e0b2d367d7735671fa202a03b32",
},
"name": "subjecty",
},
],
}
`;


@ -0,0 +1,43 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`provenance functions buildSLSAProvenancePredicate returns a provenance hydrated from an OIDC token 1`] = `
{
"params": {
"buildDefinition": {
"buildType": "https://actions.github.io/buildtypes/workflow/v1",
"externalParameters": {
"workflow": {
"path": ".github/workflows/main.yml",
"ref": "refs/heads/main",
"repository": "https://foo.ghe.com/owner/repo",
},
},
"internalParameters": {
"github": {
"event_name": "push",
"repository_id": "repo-id",
"repository_owner_id": "owner-id",
"runner_environment": "github-hosted",
},
},
"resolvedDependencies": [
{
"digest": {
"gitCommit": "babca52ab0c93ae16539e5923cb0d7403b9a093b",
},
"uri": "git+https://foo.ghe.com/owner/repo@refs/heads/main",
},
],
},
"runDetails": {
"builder": {
"id": "https://foo.ghe.com/owner/workflows/.github/workflows/publish.yml@main",
},
"metadata": {
"invocationId": "https://foo.ghe.com/owner/repo/actions/runs/run-id/attempts/run-attempt",
},
},
},
"type": "https://slsa.dev/provenance/v1",
}
`;


@ -0,0 +1,16 @@
import {attest} from '../src/attest'
describe('attest', () => {
describe('when no subject information is provided', () => {
it('throws an error', async () => {
const options = {
predicateType: 'foo',
predicate: {bar: 'baz'},
token: 'token'
}
expect(attest(options)).rejects.toThrowError(
'Must provide either subjectName and subjectDigest or subjects'
)
})
})
})


@ -0,0 +1,41 @@
import {signingEndpoints} from '../src/endpoints'
describe('signingEndpoints', () => {
const originalEnv = process.env
afterEach(() => {
process.env = originalEnv
})
describe('when using github.com', () => {
beforeEach(async () => {
process.env = {
...originalEnv,
GITHUB_SERVER_URL: 'https://github.com'
}
})
it('returns expected endpoints', async () => {
const endpoints = signingEndpoints('github')
expect(endpoints.fulcioURL).toEqual('https://fulcio.githubapp.com')
expect(endpoints.tsaServerURL).toEqual('https://timestamp.githubapp.com')
})
})
describe('when using custom domain', () => {
beforeEach(async () => {
process.env = {
...originalEnv,
GITHUB_SERVER_URL: 'https://foo.bar.com'
}
})
it('returns expected endpoints', async () => {
const endpoints = signingEndpoints('github')
expect(endpoints.fulcioURL).toEqual('https://fulcio.foo.bar.com')
expect(endpoints.tsaServerURL).toEqual('https://timestamp.foo.bar.com')
})
})
})


@ -0,0 +1,6 @@
import {attest, attestProvenance} from '../src'
it('exports functions', () => {
expect(attestProvenance).toBeInstanceOf(Function)
expect(attest).toBeInstanceOf(Function)
})


@ -0,0 +1,23 @@
import {buildIntotoStatement} from '../src/intoto'
import type {Predicate, Subject} from '../src/shared.types'
describe('buildIntotoStatement', () => {
const subject: Subject = {
name: 'subjecty',
digest: {
sha256: '7d070f6b64d9bcc530fe99cc21eaaa4b3c364e0b2d367d7735671fa202a03b32'
}
}
const predicate: Predicate = {
type: 'predicatey',
params: {
key: 'value'
}
}
it('returns an intoto statement', () => {
const statement = buildIntotoStatement([subject], predicate)
expect(statement).toMatchSnapshot()
})
})


@ -0,0 +1,199 @@
import * as jose from 'jose'
import nock from 'nock'
import {getIDTokenClaims} from '../src/oidc'
describe('getIDTokenClaims', () => {
const originalEnv = process.env
const issuer = 'https://example.com'
const audience = 'nobody'
const requestToken = 'token'
const openidConfigPath = '/.well-known/openid-configuration'
const jwksPath = '/.well-known/jwks.json'
const tokenPath = '/token'
const openIDConfig = {jwks_uri: `${issuer}${jwksPath}`}
/* eslint-disable-next-line @typescript-eslint/no-explicit-any */
let key: any
beforeEach(async () => {
process.env = {
...originalEnv,
ACTIONS_ID_TOKEN_REQUEST_URL: `${issuer}${tokenPath}?`,
ACTIONS_ID_TOKEN_REQUEST_TOKEN: requestToken
}
// Generate JWT signing key
key = await jose.generateKeyPair('PS256')
// Create JWK and JWKS
const jwk = await jose.exportJWK(key.publicKey)
const jwks = {keys: [jwk]}
nock(issuer).get(openidConfigPath).reply(200, openIDConfig)
nock(issuer).get(jwksPath).reply(200, jwks)
})
afterEach(() => {
process.env = originalEnv
})
describe('when ID token is valid', () => {
const claims = {
iss: issuer,
aud: audience,
ref: 'ref',
sha: 'sha',
repository: 'repo',
event_name: 'push',
job_workflow_ref: 'job_workflow_ref',
workflow_ref: 'workflow',
repository_id: '1',
repository_owner_id: '1',
runner_environment: 'github-hosted',
run_id: '1',
run_attempt: '1'
}
beforeEach(async () => {
const jwt = await new jose.SignJWT(claims)
.setProtectedHeader({alg: 'PS256'})
.sign(key.privateKey)
nock(issuer).get(tokenPath).query({audience}).reply(200, {value: jwt})
})
it('returns the ID token claims', async () => {
const result = await getIDTokenClaims(issuer)
expect(result).toEqual(claims)
})
})
describe('when ID token is valid (w/ enterprise slug)', () => {
const claims = {
iss: `${issuer}/foo-bar`,
aud: audience,
ref: 'ref',
sha: 'sha',
repository: 'repo',
event_name: 'push',
job_workflow_ref: 'job_workflow_ref',
workflow_ref: 'workflow',
repository_id: '1',
repository_owner_id: '1',
runner_environment: 'github-hosted',
run_id: '1',
run_attempt: '1'
}
beforeEach(async () => {
const jwt = await new jose.SignJWT(claims)
.setProtectedHeader({alg: 'PS256'})
.sign(key.privateKey)
nock(issuer).get(tokenPath).query({audience}).reply(200, {value: jwt})
})
it('returns the ID token claims', async () => {
const result = await getIDTokenClaims(issuer)
expect(result).toEqual(claims)
})
})
describe('when ID token is missing the "iss" claim', () => {
const claims = {
aud: audience
}
beforeEach(async () => {
const jwt = await new jose.SignJWT(claims)
.setProtectedHeader({alg: 'PS256'})
.sign(key.privateKey)
nock(issuer).get(tokenPath).query({audience}).reply(200, {value: jwt})
})
it('throws an error', async () => {
await expect(getIDTokenClaims(issuer)).rejects.toThrow(/missing "iss"/i)
})
})
describe('when ID token is missing required claims', () => {
const claims = {
iss: issuer,
aud: audience
}
beforeEach(async () => {
const jwt = await new jose.SignJWT(claims)
.setProtectedHeader({alg: 'PS256'})
.sign(key.privateKey)
nock(issuer).get(tokenPath).query({audience}).reply(200, {value: jwt})
})
it('throws an error', async () => {
await expect(getIDTokenClaims(issuer)).rejects.toThrow(/missing claims/i)
})
})
describe('when ID has the wrong issuer', () => {
const claims = {foo: 'bar', iss: 'foo', aud: 'nobody'}
beforeEach(async () => {
const jwt = await new jose.SignJWT(claims)
.setProtectedHeader({alg: 'PS256'})
.sign(key.privateKey)
nock(issuer).get(tokenPath).query({audience}).reply(200, {value: jwt})
})
it('throws an error', async () => {
await expect(getIDTokenClaims(issuer)).rejects.toThrow(
/unexpected "iss"/i
)
})
})
describe('when ID has the wrong audience', () => {
const claims = {foo: 'bar', iss: issuer, aud: 'bar'}
beforeEach(async () => {
const jwt = await new jose.SignJWT(claims)
.setProtectedHeader({alg: 'PS256'})
.sign(key.privateKey)
nock(issuer).get(tokenPath).query({audience}).reply(200, {value: jwt})
})
it('throws an error', async () => {
await expect(getIDTokenClaims(issuer)).rejects.toThrow(/unexpected "aud"/)
})
})
describe('when openid config cannot be retrieved', () => {
const claims = {foo: 'bar', iss: issuer, aud: 'nobody'}
beforeEach(async () => {
const jwt = await new jose.SignJWT(claims)
.setProtectedHeader({alg: 'PS256'})
.sign(key.privateKey)
nock(issuer).get(tokenPath).query({audience}).reply(200, {value: jwt})
// Disable the openid config endpoint
nock.removeInterceptor({
proto: 'https',
hostname: 'example.com',
port: '443',
method: 'GET',
path: openidConfigPath
})
})
it('throws an error', async () => {
await expect(getIDTokenClaims(issuer)).rejects.toThrow(
/failed to get id/i
)
})
})
})


@ -0,0 +1,248 @@
import * as github from '@actions/github'
import {mockFulcio, mockRekor, mockTSA} from '@sigstore/mock'
import * as jose from 'jose'
import nock from 'nock'
import {MockAgent, setGlobalDispatcher} from 'undici'
import {SIGSTORE_PUBLIC_GOOD, signingEndpoints} from '../src/endpoints'
import {attestProvenance, buildSLSAProvenancePredicate} from '../src/provenance'
describe('provenance functions', () => {
const originalEnv = process.env
const issuer = 'https://token.actions.foo.ghe.com'
const audience = 'nobody'
const jwksPath = '/.well-known/jwks.json'
const tokenPath = '/token'
// MockAgent for mocking @actions/github
const mockAgent = new MockAgent()
setGlobalDispatcher(mockAgent)
const claims = {
iss: issuer,
aud: 'nobody',
repository: 'owner/repo',
ref: 'refs/heads/main',
sha: 'babca52ab0c93ae16539e5923cb0d7403b9a093b',
job_workflow_ref: 'owner/workflows/.github/workflows/publish.yml@main',
workflow_ref: 'owner/repo/.github/workflows/main.yml@main',
event_name: 'push',
repository_id: 'repo-id',
repository_owner_id: 'owner-id',
run_id: 'run-id',
run_attempt: 'run-attempt',
runner_environment: 'github-hosted'
}
const mockIssuer = async (claims: jose.JWTPayload): Promise<void> => {
// Generate JWT signing key
const key = await jose.generateKeyPair('PS256')
// Create JWK, JWKS, and JWT
const jwk = await jose.exportJWK(key.publicKey)
const jwks = {keys: [jwk]}
const jwt = await new jose.SignJWT(claims)
.setProtectedHeader({alg: 'PS256'})
.sign(key.privateKey)
// Mock OpenID configuration and JWKS endpoints
nock(issuer)
.get('/.well-known/openid-configuration')
.reply(200, {jwks_uri: `${issuer}${jwksPath}`})
nock(issuer).get(jwksPath).reply(200, jwks)
// Mock OIDC token endpoint for populating the provenance
nock(issuer).get(tokenPath).query({audience}).reply(200, {value: jwt})
}
beforeEach(async () => {
process.env = {
...originalEnv,
ACTIONS_ID_TOKEN_REQUEST_URL: `${issuer}${tokenPath}?`,
ACTIONS_ID_TOKEN_REQUEST_TOKEN: 'token',
GITHUB_SERVER_URL: 'https://foo.ghe.com',
GITHUB_REPOSITORY: claims.repository
}
await mockIssuer(claims)
})
afterEach(() => {
process.env = originalEnv
})
describe('buildSLSAProvenancePredicate', () => {
it('returns a provenance hydrated from an OIDC token', async () => {
const predicate = await buildSLSAProvenancePredicate()
expect(predicate).toMatchSnapshot()
})
})
describe('attestProvenance', () => {
// Subject to attest
const subjectName = 'subjective'
const subjectDigest = {
sha256: '7d070f6b64d9bcc530fe99cc21eaaa4b3c364e0b2d367d7735671fa202a03b32'
}
// Fake an OIDC token
const oidcPayload = {sub: 'foo@bar.com', iss: ''}
const oidcToken = `.${Buffer.from(JSON.stringify(oidcPayload)).toString(
'base64'
)}.}`
const attestationID = '1234567890'
beforeEach(async () => {
nock(issuer)
.get(tokenPath)
.query({audience: 'sigstore'})
.reply(200, {value: oidcToken})
})
describe('when using the github Sigstore instance', () => {
beforeEach(async () => {
const {fulcioURL, tsaServerURL} = signingEndpoints('github')
// Mock Sigstore
await mockFulcio({baseURL: fulcioURL, strict: false})
await mockTSA({baseURL: tsaServerURL})
mockAgent
.get('https://api.github.com')
.intercept({
path: /^\/repos\/.*\/.*\/attestations$/,
method: 'post'
})
.reply(201, {id: attestationID})
})
describe('when the sigstore instance is explicitly set', () => {
it('attests provenance', async () => {
const attestation = await attestProvenance({
subjects: [{name: subjectName, digest: subjectDigest}],
token: 'token',
sigstore: 'github'
})
expect(attestation).toBeDefined()
expect(attestation.bundle).toBeDefined()
expect(attestation.certificate).toMatch(/-----BEGIN CERTIFICATE-----/)
expect(attestation.tlogID).toBeUndefined()
expect(attestation.attestationID).toBe(attestationID)
})
})
describe('when the sigstore instance is inferred from the repo visibility', () => {
const savedRepository = github.context.payload.repository
beforeEach(() => {
/* eslint-disable-next-line @typescript-eslint/no-explicit-any */
github.context.payload.repository = {visibility: 'private'} as any
})
afterEach(() => {
github.context.payload.repository = savedRepository
})
it('attests provenance', async () => {
const attestation = await attestProvenance({
subjects: [{name: subjectName, digest: subjectDigest}],
token: 'token'
})
expect(attestation).toBeDefined()
expect(attestation.bundle).toBeDefined()
expect(attestation.certificate).toMatch(/-----BEGIN CERTIFICATE-----/)
expect(attestation.tlogID).toBeUndefined()
expect(attestation.attestationID).toBe(attestationID)
})
})
})
describe('when using the public-good Sigstore instance', () => {
const {fulcioURL, rekorURL} = SIGSTORE_PUBLIC_GOOD
beforeEach(async () => {
// Mock Sigstore
await mockFulcio({baseURL: fulcioURL, strict: false})
await mockRekor({baseURL: rekorURL})
// Mock GH attestations API
mockAgent
.get('https://api.github.com')
.intercept({
path: /^\/repos\/.*\/.*\/attestations$/,
method: 'post'
})
.reply(201, {id: attestationID})
})
describe('when the sigstore instance is explicitly set', () => {
it('attests provenance', async () => {
const attestation = await attestProvenance({
subjects: [{name: subjectName, digest: subjectDigest}],
token: 'token',
sigstore: 'public-good'
})
expect(attestation).toBeDefined()
expect(attestation.bundle).toBeDefined()
expect(attestation.certificate).toMatch(/-----BEGIN CERTIFICATE-----/)
expect(attestation.tlogID).toBeDefined()
expect(attestation.attestationID).toBe(attestationID)
})
})
describe('when the sigstore instance is inferred from the repo visibility', () => {
const savedRepository = github.context.payload.repository
beforeEach(() => {
/* eslint-disable-next-line @typescript-eslint/no-explicit-any */
github.context.payload.repository = {visibility: 'public'} as any
})
afterEach(() => {
github.context.payload.repository = savedRepository
})
it('attests provenance', async () => {
const attestation = await attestProvenance({
subjects: [{name: subjectName, digest: subjectDigest}],
token: 'token'
})
expect(attestation).toBeDefined()
expect(attestation.bundle).toBeDefined()
expect(attestation.certificate).toMatch(/-----BEGIN CERTIFICATE-----/)
expect(attestation.tlogID).toBeDefined()
expect(attestation.attestationID).toBe(attestationID)
})
})
})
describe('when skipWrite is set to true', () => {
const {fulcioURL, rekorURL} = SIGSTORE_PUBLIC_GOOD
beforeEach(async () => {
// Mock Sigstore
await mockFulcio({baseURL: fulcioURL, strict: false})
await mockRekor({baseURL: rekorURL})
})
it('attests provenance', async () => {
const attestation = await attestProvenance({
subjectName,
subjectDigest,
token: 'token',
sigstore: 'public-good',
skipWrite: true
})
expect(attestation).toBeDefined()
expect(attestation.bundle).toBeDefined()
expect(attestation.certificate).toMatch(/-----BEGIN CERTIFICATE-----/)
expect(attestation.tlogID).toBeDefined()
expect(attestation.attestationID).toBeUndefined()
})
})
})
})


@ -0,0 +1,101 @@
import {mockFulcio, mockRekor, mockTSA} from '@sigstore/mock'
import nock from 'nock'
import {Payload, signPayload} from '../src/sign'
describe('signProvenance', () => {
const originalEnv = process.env
// Fake an OIDC token
const subject = 'foo@bar.com'
const oidcPayload = {sub: subject, iss: ''}
const oidcToken = `.${Buffer.from(JSON.stringify(oidcPayload)).toString(
'base64'
)}.}`
// Dummy provenance to be signed
const provenance = {
_type: 'https://in-toto.io/Statement/v1',
subject: {
name: 'subjective',
digest: {
sha256:
'7d070f6b64d9bcc530fe99cc21eaaa4b3c364e0b2d367d7735671fa202a03b32'
}
}
}
const payload: Payload = {
body: Buffer.from(JSON.stringify(provenance)),
type: 'application/vnd.in-toto+json'
}
const fulcioURL = 'https://fulcio.url'
const rekorURL = 'https://rekor.url'
const tsaServerURL = 'https://tsa.url'
beforeEach(() => {
// Mock OIDC token endpoint
const tokenURL = 'https://token.url'
process.env = {
...originalEnv,
ACTIONS_ID_TOKEN_REQUEST_URL: tokenURL,
ACTIONS_ID_TOKEN_REQUEST_TOKEN: 'token'
}
nock(tokenURL)
.get('/')
.query({audience: 'sigstore'})
.reply(200, {value: oidcToken})
})
afterEach(() => {
process.env = originalEnv
})
describe('when visibility is public', () => {
beforeEach(async () => {
await mockFulcio({baseURL: fulcioURL, strict: false})
await mockRekor({baseURL: rekorURL})
})
it('returns a bundle', async () => {
const att = await signPayload(payload, {fulcioURL, rekorURL})
expect(att).toBeDefined()
expect(att.mediaType).toEqual(
'application/vnd.dev.sigstore.bundle.v0.3+json'
)
expect(att.content.$case).toEqual('dsseEnvelope')
expect(att.verificationMaterial.content.$case).toEqual('certificate')
expect(att.verificationMaterial.tlogEntries).toHaveLength(1)
expect(
att.verificationMaterial.timestampVerificationData?.rfc3161Timestamps
).toHaveLength(0)
})
})
describe('when visibility is private', () => {
beforeEach(async () => {
await mockFulcio({baseURL: fulcioURL, strict: false})
await mockTSA({baseURL: tsaServerURL})
})
it('returns a bundle', async () => {
const att = await signPayload(payload, {fulcioURL, tsaServerURL})
expect(att).toBeDefined()
expect(att.mediaType).toEqual(
'application/vnd.dev.sigstore.bundle.v0.3+json'
)
expect(att.content.$case).toEqual('dsseEnvelope')
expect(att.verificationMaterial.content.$case).toEqual('certificate')
expect(att.verificationMaterial.tlogEntries).toHaveLength(0)
expect(
att.verificationMaterial.timestampVerificationData?.rfc3161Timestamps
).toHaveLength(1)
})
})
})


@ -0,0 +1,93 @@
import {MockAgent, setGlobalDispatcher} from 'undici'
import {writeAttestation} from '../src/store'
describe('writeAttestation', () => {
const originalEnv = process.env
const attestation = {foo: 'bar '}
const token = 'token'
const headers = {'X-GitHub-Foo': 'true'}
const mockAgent = new MockAgent()
setGlobalDispatcher(mockAgent)
beforeEach(() => {
process.env = {
...originalEnv,
GITHUB_REPOSITORY: 'foo/bar'
}
})
afterEach(() => {
process.env = originalEnv
})
describe('when the api call is successful', () => {
beforeEach(() => {
mockAgent
.get('https://api.github.com')
.intercept({
path: '/repos/foo/bar/attestations',
method: 'POST',
headers: {authorization: `token ${token}`, ...headers},
body: JSON.stringify({bundle: attestation})
})
.reply(201, {id: '123'})
})
it('persists the attestation', async () => {
await expect(
writeAttestation(attestation, token, {headers})
).resolves.toEqual('123')
})
})
describe('when the api call fails', () => {
beforeEach(() => {
mockAgent
.get('https://api.github.com')
.intercept({
path: '/repos/foo/bar/attestations',
method: 'POST',
headers: {authorization: `token ${token}`},
body: JSON.stringify({bundle: attestation})
})
.reply(500, 'oops')
})
it('throws an error', async () => {
await expect(
writeAttestation(attestation, token, {retry: 0})
).rejects.toThrow(/oops/)
})
})
describe('when the api call fails but succeeds on retry', () => {
beforeEach(() => {
const pool = mockAgent.get('https://api.github.com')
pool
.intercept({
path: '/repos/foo/bar/attestations',
method: 'POST',
headers: {authorization: `token ${token}`},
body: JSON.stringify({bundle: attestation})
})
.reply(500, 'oops')
.times(1)
pool
.intercept({
path: '/repos/foo/bar/attestations',
method: 'POST',
headers: {authorization: `token ${token}`},
body: JSON.stringify({bundle: attestation})
})
.reply(201, {id: '123'})
.times(1)
})
it('persists the attestation', async () => {
await expect(writeAttestation(attestation, token)).resolves.toEqual('123')
})
})
})

packages/attest/package-lock.json (generated)

File diff suppressed because it is too large.


@ -0,0 +1,58 @@
{
"name": "@actions/attest",
"version": "2.0.0",
"description": "Actions attestation lib",
"keywords": [
"github",
"actions",
"attestation"
],
"homepage": "https://github.com/actions/toolkit/tree/main/packages/attest",
"license": "MIT",
"main": "lib/index.js",
"types": "lib/index.d.ts",
"directories": {
"lib": "lib",
"test": "__tests__"
},
"files": [
"lib"
],
"publishConfig": {
"access": "public",
"provenance": true
},
"repository": {
"type": "git",
"url": "git+https://github.com/actions/toolkit.git",
"directory": "packages/attest"
},
"scripts": {
"test": "echo \"Error: run tests from root\" && exit 1",
"tsc": "tsc"
},
"bugs": {
"url": "https://github.com/actions/toolkit/issues"
},
"devDependencies": {
"@sigstore/mock": "^0.10.0",
"@sigstore/rekor-types": "^3.0.0",
"@types/jsonwebtoken": "^9.0.6",
"nock": "^13.5.1",
"undici": "^6.20.0"
},
"dependencies": {
"@actions/core": "^1.11.1",
"@actions/github": "^6.0.0",
"@actions/http-client": "^2.2.3",
"@octokit/plugin-retry": "^6.0.1",
"@sigstore/bundle": "^3.1.0",
"@sigstore/sign": "^3.1.0",
"jose": "^5.10.0"
},
"overrides": {
"@octokit/plugin-retry": {
"@octokit/core": "^5.2.0"
}
}
}


@ -0,0 +1,117 @@
import {bundleToJSON} from '@sigstore/bundle'
import {X509Certificate} from 'crypto'
import {SigstoreInstance, signingEndpoints} from './endpoints'
import {buildIntotoStatement} from './intoto'
import {Payload, signPayload} from './sign'
import {writeAttestation} from './store'
import type {Bundle} from '@sigstore/sign'
import type {Attestation, Predicate, Subject} from './shared.types'
const INTOTO_PAYLOAD_TYPE = 'application/vnd.in-toto+json'
/**
* Options for attesting a subject / predicate.
*/
export type AttestOptions = {
/**
* @deprecated Use `subjects` instead.
**/
subjectName?: string
/**
* @deprecated Use `subjects` instead.
**/
subjectDigest?: Record<string, string>
// Subjects to be attested.
subjects?: Subject[]
// Content type of the predicate being attested.
predicateType: string
// Predicate to be attested.
predicate: object
// GitHub token for writing attestations.
token: string
// Sigstore instance to use for signing. Must be one of "public-good" or
// "github".
sigstore?: SigstoreInstance
// HTTP headers to include in request to attestations API.
headers?: {[header: string]: string | number | undefined}
// Whether to skip writing the attestation to the GH attestations API.
skipWrite?: boolean
}
/**
* Generates an attestation for the given subject and predicate. The subject and
* predicate are combined into an in-toto statement, which is then signed using
* the identified Sigstore instance and stored as an attestation.
* @param options - The options for attestation.
* @returns A promise that resolves to the attestation.
*/
export async function attest(options: AttestOptions): Promise<Attestation> {
let subjects: Subject[]
if (options.subjects) {
subjects = options.subjects
} else if (options.subjectName && options.subjectDigest) {
subjects = [{name: options.subjectName, digest: options.subjectDigest}]
} else {
throw new Error(
'Must provide either subjectName and subjectDigest or subjects'
)
}
const predicate: Predicate = {
type: options.predicateType,
params: options.predicate
}
const statement = buildIntotoStatement(subjects, predicate)
// Sign the provenance statement
const payload: Payload = {
body: Buffer.from(JSON.stringify(statement)),
type: INTOTO_PAYLOAD_TYPE
}
const endpoints = signingEndpoints(options.sigstore)
const bundle = await signPayload(payload, endpoints)
// Store the attestation
let attestationID: string | undefined
if (options.skipWrite !== true) {
attestationID = await writeAttestation(
bundleToJSON(bundle),
options.token,
{headers: options.headers}
)
}
return toAttestation(bundle, attestationID)
}
function toAttestation(bundle: Bundle, attestationID?: string): Attestation {
let certBytes: Buffer
switch (bundle.verificationMaterial.content.$case) {
case 'x509CertificateChain':
certBytes =
bundle.verificationMaterial.content.x509CertificateChain.certificates[0]
.rawBytes
break
case 'certificate':
certBytes = bundle.verificationMaterial.content.certificate.rawBytes
break
default:
throw new Error('Bundle must contain an x509 certificate')
}
const signingCert = new X509Certificate(certBytes)
// Collect transparency log ID if available
const tlogEntries = bundle.verificationMaterial.tlogEntries
const tlogID = tlogEntries.length > 0 ? tlogEntries[0].logIndex : undefined
return {
bundle: bundleToJSON(bundle),
certificate: signingCert.toString(),
tlogID,
attestationID
}
}


@ -0,0 +1,55 @@
import * as github from '@actions/github'
const PUBLIC_GOOD_ID = 'public-good'
const GITHUB_ID = 'github'
const FULCIO_PUBLIC_GOOD_URL = 'https://fulcio.sigstore.dev'
const REKOR_PUBLIC_GOOD_URL = 'https://rekor.sigstore.dev'
export type SigstoreInstance = typeof PUBLIC_GOOD_ID | typeof GITHUB_ID
export type Endpoints = {
fulcioURL: string
rekorURL?: string
tsaServerURL?: string
}
export const SIGSTORE_PUBLIC_GOOD: Endpoints = {
fulcioURL: FULCIO_PUBLIC_GOOD_URL,
rekorURL: REKOR_PUBLIC_GOOD_URL
}
export const signingEndpoints = (sigstore?: SigstoreInstance): Endpoints => {
let instance: SigstoreInstance
// An explicitly set instance type takes precedence, but if not set, use the
// repository's visibility to determine the instance type.
if (sigstore && [PUBLIC_GOOD_ID, GITHUB_ID].includes(sigstore)) {
instance = sigstore
} else {
instance =
github.context.payload.repository?.visibility === 'public'
? PUBLIC_GOOD_ID
: GITHUB_ID
}
switch (instance) {
case PUBLIC_GOOD_ID:
return SIGSTORE_PUBLIC_GOOD
case GITHUB_ID:
return buildGitHubEndpoints()
}
}
function buildGitHubEndpoints(): Endpoints {
const serverURL = process.env.GITHUB_SERVER_URL || 'https://github.com'
let host = new URL(serverURL).hostname
if (host === 'github.com') {
host = 'githubapp.com'
}
return {
fulcioURL: `https://fulcio.${host}`,
tsaServerURL: `https://timestamp.${host}`
}
}


@ -0,0 +1,9 @@
export {AttestOptions, attest} from './attest'
export {
AttestProvenanceOptions,
attestProvenance,
buildSLSAProvenancePredicate
} from './provenance'
export type {SerializedBundle} from '@sigstore/bundle'
export type {Attestation, Predicate, Subject} from './shared.types'


@ -0,0 +1,32 @@
import {Predicate, Subject} from './shared.types'
const INTOTO_STATEMENT_V1_TYPE = 'https://in-toto.io/Statement/v1'
/**
* An in-toto statement.
* https://github.com/in-toto/attestation/blob/main/spec/v1/statement.md
*/
export type InTotoStatement = {
_type: string
subject: Subject[]
predicateType: string
predicate: object
}
/**
* Assembles the given subject and predicate into an in-toto statement.
* @param subject - The subject of the statement.
* @param predicate - The predicate of the statement.
* @returns The constructed in-toto statement.
*/
export const buildIntotoStatement = (
subjects: Subject[],
predicate: Predicate
): InTotoStatement => {
return {
_type: INTOTO_STATEMENT_V1_TYPE,
subject: subjects,
predicateType: predicate.type,
predicate: predicate.params
}
}

packages/attest/src/oidc.ts (new file)

@ -0,0 +1,117 @@
import {getIDToken} from '@actions/core'
import {HttpClient} from '@actions/http-client'
import * as jose from 'jose'
const OIDC_AUDIENCE = 'nobody'
const VALID_SERVER_URLS = [
'https://github.com',
new RegExp('^https://[a-z0-9-]+\\.ghe\\.com$')
] as const
const REQUIRED_CLAIMS = [
'iss',
'ref',
'sha',
'repository',
'event_name',
'job_workflow_ref',
'workflow_ref',
'repository_id',
'repository_owner_id',
'runner_environment',
'run_id',
'run_attempt'
] as const
export type ClaimSet = {[K in (typeof REQUIRED_CLAIMS)[number]]: string}
type OIDCConfig = {
jwks_uri: string
}
export const getIDTokenClaims = async (issuer?: string): Promise<ClaimSet> => {
issuer = issuer || getIssuer()
try {
const token = await getIDToken(OIDC_AUDIENCE)
const claims = await decodeOIDCToken(token, issuer)
assertClaimSet(claims)
return claims
} catch (error) {
throw new Error(`Failed to get ID token: ${error.message}`)
}
}
const decodeOIDCToken = async (
token: string,
issuer: string
): Promise<jose.JWTPayload> => {
// Verify and decode token
const jwks = jose.createLocalJWKSet(await getJWKS(issuer))
const {payload} = await jose.jwtVerify(token, jwks, {
audience: OIDC_AUDIENCE
})
if (!payload.iss) {
throw new Error('Missing "iss" claim')
}
// Check that the issuer STARTS WITH the expected issuer URL to account for
// the fact that the value may include an enterprise-specific slug
if (!payload.iss.startsWith(issuer)) {
throw new Error(`Unexpected "iss" claim: ${payload.iss}`)
}
return payload
}
const getJWKS = async (issuer: string): Promise<jose.JSONWebKeySet> => {
const client = new HttpClient('@actions/attest')
const config = await client.getJson<OIDCConfig>(
`${issuer}/.well-known/openid-configuration`
)
if (!config.result) {
throw new Error('No OpenID configuration found')
}
const jwks = await client.getJson<jose.JSONWebKeySet>(config.result.jwks_uri)
if (!jwks.result) {
throw new Error('No JWKS found for issuer')
}
return jwks.result
}
function assertClaimSet(claims: jose.JWTPayload): asserts claims is ClaimSet {
const missingClaims: string[] = []
for (const claim of REQUIRED_CLAIMS) {
if (!(claim in claims)) {
missingClaims.push(claim)
}
}
if (missingClaims.length > 0) {
throw new Error(`Missing claims: ${missingClaims.join(', ')}`)
}
}
// Derive the current OIDC issuer based on the server URL
function getIssuer(): string {
const serverURL = process.env.GITHUB_SERVER_URL || 'https://github.com'
// Ensure the server URL is a valid GitHub server URL
if (!VALID_SERVER_URLS.some(valid_url => serverURL.match(valid_url))) {
throw new Error(`Invalid server URL: ${serverURL}`)
}
let host = new URL(serverURL).hostname
if (host === 'github.com') {
host = 'githubusercontent.com'
}
return `https://token.actions.${host}`
}


@ -0,0 +1,95 @@
import {attest, AttestOptions} from './attest'
import {getIDTokenClaims} from './oidc'
import type {Attestation, Predicate} from './shared.types'
const SLSA_PREDICATE_V1_TYPE = 'https://slsa.dev/provenance/v1'
const GITHUB_BUILD_TYPE = 'https://actions.github.io/buildtypes/workflow/v1'
export type AttestProvenanceOptions = Omit<
AttestOptions,
'predicate' | 'predicateType'
> & {
issuer?: string
}
/**
* Builds an SLSA (Supply Chain Levels for Software Artifacts) provenance
* predicate using the GitHub Actions Workflow build type.
* https://slsa.dev/spec/v1.0/provenance
* https://github.com/slsa-framework/github-actions-buildtypes/tree/main/workflow/v1
* @param issuer - URL for the OIDC issuer. Defaults to the GitHub Actions token
* issuer.
* @returns The SLSA provenance predicate.
*/
export const buildSLSAProvenancePredicate = async (
issuer?: string
): Promise<Predicate> => {
const serverURL = process.env.GITHUB_SERVER_URL
const claims = await getIDTokenClaims(issuer)
// Split just the path and ref from the workflow string.
// owner/repo/.github/workflows/main.yml@main =>
// .github/workflows/main.yml, main
const [workflowPath] = claims.workflow_ref
.replace(`${claims.repository}/`, '')
.split('@')
return {
type: SLSA_PREDICATE_V1_TYPE,
params: {
buildDefinition: {
buildType: GITHUB_BUILD_TYPE,
externalParameters: {
workflow: {
ref: claims.ref,
repository: `${serverURL}/${claims.repository}`,
path: workflowPath
}
},
internalParameters: {
github: {
event_name: claims.event_name,
repository_id: claims.repository_id,
repository_owner_id: claims.repository_owner_id,
runner_environment: claims.runner_environment
}
},
resolvedDependencies: [
{
uri: `git+${serverURL}/${claims.repository}@${claims.ref}`,
digest: {
gitCommit: claims.sha
}
}
]
},
runDetails: {
builder: {
id: `${serverURL}/${claims.job_workflow_ref}`
},
metadata: {
invocationId: `${serverURL}/${claims.repository}/actions/runs/${claims.run_id}/attempts/${claims.run_attempt}`
}
}
}
}
}
/**
* Attests the build provenance of the provided subject. Generates the SLSA
* build provenance predicate, assembles it into an in-toto statement, and
* attests it.
*
* @param options - The options for attesting the provenance.
* @returns A promise that resolves to the attestation.
*/
export async function attestProvenance(
options: AttestProvenanceOptions
): Promise<Attestation> {
const predicate = await buildSLSAProvenancePredicate(options.issuer)
return attest({
...options,
predicateType: predicate.type,
predicate: predicate.params
})
}


@ -0,0 +1,52 @@
import type {SerializedBundle} from '@sigstore/bundle'
/*
* The subject of an attestation.
*/
export type Subject = {
/*
* Name of the subject.
*/
name: string
/*
* Digests of the subject. Should be a map of digest algorithms to their hex-encoded values.
*/
digest: Record<string, string>
}
/*
* The predicate of an attestation.
*/
export type Predicate = {
/*
* URI identifying the content type of the predicate.
*/
type: string
/*
* Predicate parameters.
*/
params: object
}
/*
* Artifact attestation.
*/
export type Attestation = {
/*
* Serialized Sigstore bundle containing the provenance attestation,
* signature, signing certificate and witnessed timestamp.
*/
bundle: SerializedBundle
/*
* PEM-encoded signing certificate used to sign the attestation.
*/
certificate: string
/*
* ID of Rekor transparency log entry created for the attestation.
*/
tlogID?: string
/*
* ID of the persisted attestation (accessible via the GH API).
*/
attestationID?: string
}

packages/attest/src/sign.ts (new file)

@ -0,0 +1,109 @@
import {
Bundle,
BundleBuilder,
CIContextProvider,
DSSEBundleBuilder,
FulcioSigner,
RekorWitness,
TSAWitness,
Witness
} from '@sigstore/sign'
const OIDC_AUDIENCE = 'sigstore'
const DEFAULT_TIMEOUT = 10000
const DEFAULT_RETRIES = 3
/**
* The payload to be signed (body) and its media type (type).
*/
export type Payload = {
body: Buffer
type: string
}
/**
* Options for signing a document.
*/
export type SignOptions = {
/**
* The URL of the Fulcio service.
*/
fulcioURL: string
/**
* The URL of the Rekor service.
*/
rekorURL?: string
/**
* The URL of the TSA (Time Stamping Authority) server.
*/
tsaServerURL?: string
/**
* The timeout duration in milliseconds when communicating with Sigstore
* services.
*/
timeout?: number
/**
* The number of retry attempts.
*/
retry?: number
}
/**
* Signs the provided payload with a Sigstore-issued certificate and returns the
* signature bundle.
* @param payload Payload to be signed.
* @param options Signing options.
* @returns A promise that resolves to the Sigstore signature bundle.
*/
export const signPayload = async (
payload: Payload,
options: SignOptions
): Promise<Bundle> => {
const artifact = {
data: payload.body,
type: payload.type
}
// Sign the artifact and build the bundle
return initBundleBuilder(options).create(artifact)
}
// Assembles the Sigstore bundle builder with the appropriate options
const initBundleBuilder = (opts: SignOptions): BundleBuilder => {
const identityProvider = new CIContextProvider(OIDC_AUDIENCE)
const timeout = opts.timeout || DEFAULT_TIMEOUT
const retry = opts.retry || DEFAULT_RETRIES
const witnesses: Witness[] = []
const signer = new FulcioSigner({
identityProvider,
fulcioBaseURL: opts.fulcioURL,
timeout,
retry
})
if (opts.rekorURL) {
witnesses.push(
new RekorWitness({
rekorBaseURL: opts.rekorURL,
fetchOnConflict: true,
timeout,
retry
})
)
}
if (opts.tsaServerURL) {
witnesses.push(
new TSAWitness({
tsaBaseURL: opts.tsaServerURL,
timeout,
retry
})
)
}
// Build the bundle with the singleCertificate option, which triggers the
// creation of v0.3 DSSE bundles
return new DSSEBundleBuilder({signer, witnesses, singleCertificate: true})
}
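
A usage sketch for `signPayload` (not part of this diff), signing a serialized in-toto statement against the public-good Sigstore endpoints. The relative import path and the statement value are illustrative; the ambient OIDC credentials that `CIContextProvider` reads are only present inside an Actions run.

import {signPayload} from './sign' // illustrative path

async function sign(): Promise<void> {
  // Placeholder in-toto statement; a real one carries subjects and a predicate.
  const statement = {_type: 'https://in-toto.io/Statement/v1'}
  const bundle = await signPayload(
    {
      body: Buffer.from(JSON.stringify(statement)),
      type: 'application/vnd.in-toto+json'
    },
    {
      fulcioURL: 'https://fulcio.sigstore.dev',
      rekorURL: 'https://rekor.sigstore.dev'
    }
  )
  console.log(JSON.stringify(bundle))
}

Omitting `rekorURL` or `tsaServerURL` simply drops the corresponding witness, as `initBundleBuilder` above shows.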


@@ -0,0 +1,48 @@
import * as github from '@actions/github'
import {retry} from '@octokit/plugin-retry'
import {RequestHeaders} from '@octokit/types'
const CREATE_ATTESTATION_REQUEST = 'POST /repos/{owner}/{repo}/attestations'
const DEFAULT_RETRY_COUNT = 5
export type WriteOptions = {
retry?: number
headers?: RequestHeaders
}
/**
* Writes an attestation to the repository's attestations endpoint.
* @param attestation - The attestation to write.
* @param token - The GitHub token for authentication.
* @returns The ID of the attestation.
* @throws Error if the attestation fails to persist.
*/
export const writeAttestation = async (
attestation: unknown,
token: string,
options: WriteOptions = {}
): Promise<string> => {
const retries = options.retry ?? DEFAULT_RETRY_COUNT
const octokit = github.getOctokit(token, {retry: {retries}}, retry)
try {
const response = await octokit.request(CREATE_ATTESTATION_REQUEST, {
owner: github.context.repo.owner,
repo: github.context.repo.repo,
headers: options.headers,
bundle: attestation as {
mediaType?: string
verificationMaterial?: {[key: string]: unknown}
dsseEnvelope?: {[key: string]: unknown}
}
})
const data =
typeof response.data === 'string'
? JSON.parse(response.data)
: response.data
return data?.id
} catch (err) {
const message = err instanceof Error ? err.message : err
throw new Error(`Failed to persist attestation: ${message}`)
}
}
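
A hedged usage sketch (not part of this diff): persisting a previously produced bundle. `bundleJSON` stands in for the serialized Sigstore bundle, and the token must carry the `attestations: write` permission.

import {writeAttestation} from './store' // illustrative path

const bundleJSON: unknown = {} // placeholder for the serialized Sigstore bundle

const attestationID = await writeAttestation(
  bundleJSON,
  process.env.GITHUB_TOKEN ?? '',
  {retry: 3}
)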


@@ -0,0 +1,12 @@
{
"extends": "../../tsconfig.json",
"compilerOptions": {
"baseUrl": "./",
"outDir": "./lib",
"declaration": true,
"rootDir": "./src"
},
"include": [
"./src"
]
}


@@ -6,6 +6,20 @@ See ["Caching dependencies to speed up workflows"](https://docs.github.com/en/ac
Note that GitHub will remove any cache entries that have not been accessed in over 7 days. There is no limit on the number of caches you can store, but the total size of all caches in a repository is limited to 10 GB. If you exceed this limit, GitHub will save your cache but will begin evicting caches until the total size is less than 10 GB.
## ⚠️ Important changes
The cache backend service has been rewritten from the ground up for improved performance and reliability. The [@actions/cache](https://github.com/actions/toolkit/tree/main/packages/cache) package now integrates with the new cache service (v2) APIs.
The new service will gradually roll out as of **February 1st, 2025**. The legacy service will also be sunset on the same date. Changes in this release are **fully backward compatible**.
**All previous versions of this package will be deprecated**. We recommend upgrading to version `4.0.0` as soon as possible, before **February 1st, 2025**.
If you do not upgrade, all workflow runs using any of the deprecated [@actions/cache](https://github.com/actions/toolkit/tree/main/packages/cache) packages will fail.
Upgrading to the recommended version should not break or require any changes to your workflows beyond updating your `package.json` to version `4.0.0`.
Read more about the change & access the migration guide: [reference to the announcement](https://github.com/actions/toolkit/discussions/1890).
## Usage
This package is used by the v2+ versions of our first party cache action. You can find an example implementation in the cache repo [here](https://github.com/actions/cache).
@@ -47,5 +61,3 @@ const cacheKey = await cache.restoreCache(paths, key, restoreKeys)
A cache is downloaded in multiple segments of fixed size (now `128MB` to fail fast; previously `1GB` for a `32-bit` runner and `2GB` for a `64-bit` runner). Occasionally a segment download gets stuck, leaving the workflow job hung until it eventually fails. Version `v3.0.4` of the cache package introduced a segment download timeout that aborts the stuck segment download and lets the job proceed with a cache miss.
The default value of this timeout is 10 minutes (as of `v3.2.1`; it was 60 minutes in versions `v3.0.4` through `v3.2.0`, inclusive) and can be customized by setting an [environment variable](https://docs.github.com/en/actions/learn-github-actions/environment-variables) named `SEGMENT_DOWNLOAD_TIMEOUT_MINS` to the timeout value in minutes.
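As a rough illustration of the pattern only (this is not the package's actual implementation), the timeout can be resolved from the environment variable with a safe fallback:

// Illustrative sketch: resolve the segment download timeout in milliseconds.
// Names and logic here are assumptions, not @actions/cache internals.
const DEFAULT_SEGMENT_TIMEOUT_MINS = 10

function segmentTimeoutMs(): number {
  const raw = process.env['SEGMENT_DOWNLOAD_TIMEOUT_MINS']
  const mins = raw === undefined ? DEFAULT_SEGMENT_TIMEOUT_MINS : Number(raw)
  const valid = Number.isFinite(mins) && mins > 0
  return (valid ? mins : DEFAULT_SEGMENT_TIMEOUT_MINS) * 60 * 1000
}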

Some files were not shown because too many files have changed in this diff.