Uploading storage and deployment data to the linked artifacts page

Associate packages and builds in your organization with storage and deployment data.

Who can use this feature?

Anyone with write access to an organization-owned repository

Organization accounts on any plan

The linked artifacts page includes storage records and deployment records for artifacts that you build in your organization. Metadata for each artifact is provided by your organization using one of the following methods:

  • A workflow containing one of GitHub's actions for artifact attestations
  • An integration with JFrog Artifactory or Microsoft Defender for Cloud
  • A custom script using the artifact metadata REST API

The available methods depend on whether you are uploading a storage record or a deployment record. For more information about record types, see About linked artifacts.

Uploading a storage record

You can upload a storage record by creating an artifact attestation or enabling an integration with JFrog Artifactory. If you don't want to use these options, you must set up a custom integration with the REST API.

Attesting with GitHub Actions

You can upload a storage record for an artifact using GitHub's first-party actions for artifact attestations, in the same workflow you use to build the artifact. These actions create signed provenance and integrity guarantees for the software you build, and they automatically upload a storage record to the linked artifacts page.

The attest and attest-build-provenance actions automatically create storage records on the linked artifacts page if both:

  • The push-to-registry option is set to true
  • The workflow that includes the action has the artifact-metadata: write permission

For more information on using these actions, see Using artifact attestations to establish provenance for builds.

If the artifact does not require attestation, or if you want to upload deployment records or additional storage metadata, see the following sections.

Using the JFrog integration

This two-way integration automatically keeps your storage records on GitHub up to date with the artifact on JFrog. For example, attestations you create on GitHub are automatically uploaded to JFrog, and promoting an artifact to production on JFrog automatically adds the production context to the record on GitHub.

For setup instructions, see Get Started with JFrog Artifactory and GitHub Integration in the JFrog documentation.

Using the REST API

For artifacts that do not need to be attested and are not stored on JFrog, you can create a custom integration using the Create artifact metadata storage record API endpoint. You should configure your system to call the endpoint whenever an artifact is published to your chosen package repository.

Note

If the artifact is not associated with a provenance attestation on GitHub, the github_repository parameter is mandatory.
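The call above can be sketched as a standalone script using the GitHub CLI. This is a minimal sketch, not a definitive integration: the organization name, artifact details, registry URL, and repository values are placeholders, and `github_repository` is included on the assumption that the artifact has no provenance attestation on GitHub.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Placeholder values -- substitute your own.
ORG="my-org"
ARTIFACT_NAME="my-container-image"
ARTIFACT_VERSION="1.1.2"
ARTIFACT_DIGEST="sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"

# Build the request body. github_repository is mandatory when the
# artifact is not associated with a provenance attestation on GitHub.
jq -n \
  --arg name "$ARTIFACT_NAME" \
  --arg version "$ARTIFACT_VERSION" \
  --arg digest "$ARTIFACT_DIGEST" \
  '{
    name: $name,
    version: $version,
    digest: $digest,
    registry_url: "https://my-registry.azurecr.io",
    repository: "my-repository",
    github_repository: "my-org/my-repo"
  }' > create-record.json

# Send the record. Requires a token with the artifact-metadata: write
# permission; skipped here when no token is available.
if [ -n "${GITHUB_TOKEN:-}" ]; then
  gh api -X POST "orgs/$ORG/artifacts/metadata/storage-record" \
    --input create-record.json
fi
```

In a publishing pipeline, you would run this script as the final step after the push to your package repository succeeds, so that the storage record always reflects a published artifact.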

Uploading a deployment record

If you store artifacts in Microsoft Defender for Cloud (MDC), you can use an integration to automatically sync data to the linked artifacts page. Otherwise, you must set up a custom integration with the REST API.

Using the Microsoft Defender for Cloud integration

You can connect your MDC instance to your GitHub organization. MDC will automatically send deployment and runtime data to GitHub.

For setup instructions, see Quick Start: Connect your GitHub Environment to Microsoft Defender for Cloud in the documentation for MDC.

Note

The integration with Microsoft Defender for Cloud is in public preview and subject to change.

Using the REST API

The Create an artifact deployment record API endpoint allows systems to send deployment data for a specific artifact to GitHub, such as its name, digest, environments, cluster, and deployment. You should call this endpoint whenever an artifact is deployed to a new staging or production environment.

Note

If the artifact is not associated with a provenance attestation on GitHub, the github_repository parameter is mandatory.
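A deployment record upload can be sketched along the same lines. Note the hedges: the endpoint path `deployment-record` is assumed by analogy with the storage-record endpoint, the exact field names should be checked against the endpoint reference, and all values below are placeholders.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Placeholder values -- substitute your own.
ORG="my-org"

# Build a deployment payload. Field names here are illustrative;
# confirm them against the "Create an artifact deployment record"
# endpoint documentation.
jq -n '{
  name: "my-container-image",
  digest: "sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
  environment: "production",
  cluster: "prod-cluster-eu"
}' > deploy-record.json

if [ -n "${GITHUB_TOKEN:-}" ]; then
  # Endpoint path assumed by analogy with the storage-record endpoint.
  gh api -X POST "orgs/$ORG/artifacts/metadata/deployment-record" \
    --input deploy-record.json
fi
```

Hooking this script into your deployment tooling (for example, as a post-deploy step in your CD pipeline) keeps the linked artifacts page in sync each time an artifact reaches a new staging or production environment.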

Verifying an upload

To check that a record has been uploaded successfully, you can view the updated artifact in your organization settings. See Auditing your organization's builds on the linked artifacts page.

Removing unwanted records

It is not possible to delete an artifact from the linked artifacts page. However, you can update a storage record or deployment record to reflect an artifact's status. See Removing artifacts from the linked artifacts page.

GitHub Actions examples

You can upload data to the linked artifacts page in the same workflow you use to build and publish an artifact.

Generating an attestation

In the following example, we build and publish a Docker image, then use the ${{ steps.push.outputs.digest }} output in the next step to generate a provenance attestation.

The attest-build-provenance action automatically uploads a storage record to the linked artifacts page when push-to-registry: true is set and the workflow includes the artifact-metadata: write permission.


env:
  IMAGE_NAME: my-container-image
  ACR_ENDPOINT: my-registry.azurecr.io

jobs:
  generate-build:
    name: Build and publish Docker image
    runs-on: ubuntu-latest
    permissions:
      id-token: write
      contents: read
      attestations: write
      packages: write
      artifact-metadata: write

    steps:
      - name: Build and push Docker image
        id: push
        uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83
        with:
          context: .
          push: true
          tags: |
            ${{ env.ACR_ENDPOINT }}/${{ env.IMAGE_NAME }}:latest
            ${{ env.ACR_ENDPOINT }}/${{ env.IMAGE_NAME }}:${{ github.sha }}

      - name: Generate artifact attestation
        uses: actions/attest-build-provenance@v3
        with:
          subject-name: ${{ env.ACR_ENDPOINT }}/${{ env.IMAGE_NAME }}
          subject-digest: ${{ steps.push.outputs.digest }}
          push-to-registry: true

Using the REST API

Alternatively, if you are not generating an attestation, you can call the artifact metadata API directly.


env:
  IMAGE_NAME: my-container-image
  IMAGE_VERSION: 1.1.2
  ACR_ENDPOINT: my-registry.azurecr.io

jobs:
  generate-build:
    name: Build and publish Docker image
    runs-on: ubuntu-latest
    permissions:
      id-token: write
      contents: read
      packages: write
      artifact-metadata: write

    steps:
      - name: Build and push Docker image
        id: push
        uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83
        with:
          context: .
          push: true
          tags: |
            ${{ env.ACR_ENDPOINT }}/${{ env.IMAGE_NAME }}:latest
            ${{ env.ACR_ENDPOINT }}/${{ env.IMAGE_NAME }}:${{ github.sha }}

      - name: Create artifact metadata storage record
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          jq -n \
            --arg artifactName "${{ env.IMAGE_NAME }}" \
            --arg artifactVersion "${{ env.IMAGE_VERSION }}" \
            --arg artifactDigest "${{ steps.push.outputs.digest }}" \
            '{"name": $artifactName, "digest": $artifactDigest, "version": $artifactVersion, "registry_url": "https://azurecr.io", "repository": "my-repository"}' > create-record.json

          gh api -X POST orgs/${{ github.repository_owner }}/artifacts/metadata/storage-record --input create-record.json
        shell: bash

Next steps

Once you have uploaded data, teams in your organization can use the context from storage and deployment data to prioritize security alerts. See Prioritizing Dependabot and code scanning alerts using production context.