Step options

These options define the steps in a pipeline, including the required script property, which runs the commands and scripts for the step, such as compilers and tests.

Step options for pipelines

Steps accept the following options.

The Step property

The step option is used to define a build execution unit. Steps are executed in the order that they appear in the bitbucket-pipelines.yml file. A single pipeline can have up to 100 steps.

Each step in your pipeline will start a separate Docker container to run the commands configured in the script option. Each step can be customized using options such as:

  • image — to use a different Docker image.

  • max-time — to set the maximum time allowed for a step.

  • caches and services — for specific caches and services.

  • artifacts — to retain artifacts that subsequent steps can consume.

  • clone — to customize the Git clone operation for the step.

  • trigger — to set the step to manual, requiring a user to trigger it manually before the pipeline can continue.

Property — step

Required — Yes

Data type — Block of new-line separated key-value pairs (YAML spec - Block Mapping)

Allowed parent properties — branches, custom, default, pull-requests, steps, and tags.

Allowed child properties — Requires the script property. The following properties are optional: after-script, artifacts, caches, clone, condition, deployment, fail-fast, image, name, oidc, runs-on, services, size, and trigger.

Example — a pipeline with a single step

pipelines:
  default:
    - step:
        script:
          - echo "Hello, World!"

Example — a pipeline with sequential and parallel steps

pipelines:
  default:
    - step: # sequential step
        name: Build
        script:
          - ./build.sh
    - step: # sequential step
        name: Build
        script:
          - ./build.sh
    - parallel: # these 2 steps will run in parallel
        steps:
          - step:
              name: Integration 1
              script:
                - ./integration-tests.sh --batch 1
          - step:
              name: Integration 2
              script:
                - ./integration-tests.sh --batch 2
    - step: # non-parallel step
        script:
          - ./deploy.sh

The required script property

Script

The script property is used to specify a list of commands to be run for a step. The list of commands will be run in the order they are listed, without any automatic clean-up operations between the commands. We recommend that you move large scripts to a separate script file and call it from the bitbucket-pipelines.yml.

Property — script

Required — Yes

Data type — A list of Strings and/or pipe properties (YAML spec - Sequence)

Allowed parent properties — step

Allowed child properties — pipe (optional)

Example — using the script property to run a basic command

pipelines:
  default:
    - step:
        script:
          - echo "Hello, World!"

Example — using the script property to run a large script using a separate file

pipelines:
  default:
    - step:
        script:
          - ./long-build-script.sh

Example — using the script property to run a sequence of commands for a single step

pipelines:
  default:
    - step:
        script:
          - echo "Hello,"
          - echo "World!"
          - ./build.sh

Example — using the script and pipe properties to run a pipe

pipelines:
  default:
    - step:
        name: Alert Opsgenie
        script:
          - echo "Sending an alert through Opsgenie"
          - pipe: atlassian/opsgenie-send-alert:latest
            variables:
              GENIE_KEY: $GENIE_KEY
              MESSAGE: "Danger, Will Robinson!"
              DESCRIPTION: "An Opsgenie alert sent from Bitbucket Pipelines"
              SOURCE: "Bitbucket Pipelines"
              PRIORITY: "P1"

General options

Size

The size option allocates additional memory to a step, or to the whole pipeline. By specifying a size of 2x, you'll have double the memory available. 1x steps (the default) are allocated 4 GB of memory, and 2x steps are allocated 8 GB of memory. Note that the memory allocated is shared by both the script in the step and any services on the step.

This option is available for steps run on the Bitbucket Cloud infrastructure and a Linux Docker self-hosted runner. Shell-based runners, such as the Windows PowerShell, macOS shell, and Linux shell runners use all available memory on the host machine.

2x steps use twice the number of build minutes of a 1x step.

Property — size

Required — No

Data type — String

Allowed values — 1x or 2x

Default value — 1x

Allowed parent properties — options and step

Example — using the size option to increase the memory available to all pipeline steps

options:
  size: 2x
pipelines:
  default:
    - step:
        script:
          - echo "2x memory is probably not needed for an echo command"

Example — using the size option to increase the memory available to a pipeline step

pipelines:
  default:
    - step:
        size: 2x
        script:
          - echo "This step gets double the memory!"

After-script

The after-script option lists the commands that will run after the script in the step has completed, regardless of whether the script succeeds or fails. This can be useful for clean-up commands, test coverage, notifications, or rollbacks you might want to run. The BITBUCKET_EXIT_CODE pipeline variable can be used to determine whether the script in the step succeeded or failed.

If any commands in the after-script fail:

  • we won't run any more commands listed in the after-script of that step.

  • it will not affect the reported status of the step.
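As a sketch of how the BITBUCKET_EXIT_CODE variable can be used, the following step sends a notification only when the script failed (the notify-failure.sh script name is a hypothetical placeholder):

```yaml
pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm test
        after-script:
          # BITBUCKET_EXIT_CODE is 0 when the step's script succeeded,
          # and non-zero when it failed
          - if [ "$BITBUCKET_EXIT_CODE" -ne 0 ]; then ./notify-failure.sh; fi
```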

Property — after-script

Required — No

Data type — A list of Strings and/or pipe properties (YAML spec - Sequence)

Allowed parent properties — step

Allowed child properties — pipe (optional)

Example — using the after-script option to run a command after the script commands

pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm test
        after-script:
          - echo "after script has run!"

Name

A name for the step, stage, or pipe. The name will be shown in the Bitbucket Pipeline logs and the Bitbucket UI. Names should be unique (within the pipeline) and describe the step or the steps in the stage.

Property — name

Required — No

Data type — String

Allowed parent properties — step, pipe, and stage

Example — using name to label a stage, a pipe, and two steps

pipelines:
  default:
    - stage:
        name: Build and test
        steps:
          - step:
              name: Build app
              script:
                - sh ./build-app.sh
          - step:
              name: Run unit tests
              script:
                - sh ./run-tests.sh
    - step:
        script:
          - pipe: atlassian/slack-notify:latest
            name: Send a message to Slack
            variables:
              WEBHOOK_URL: $SLACK_WEBHOOK
              PRETEXT: 'Hello, Slack!'
              MESSAGE: 'Hello, Slack!!'

Fail fast

fail-fast can be applied to all parallel steps or to a specific step in a parallel group:

  • If a step has fail-fast: false, then the step can fail without the whole parallel group stopping.

  • If a step has fail-fast: true, then the whole parallel group will stop if the step fails.

Property — fail-fast

Required — No

Data type — Boolean

Allowed values — true or false

Default value — false

Allowed parent properties — step and parallel

Example — using fail-fast to stop parallel steps when one fails

pipelines:
  default:
    - step:
        name: Build
        script:
          - ./build.sh
    - parallel:
        # this option forces all running steps in the group to stop if any step fails
        fail-fast: true
        steps:
          - step:
              name: Integration 1
              script:
                - ./integration-tests.sh --batch 1
          - step:
              name: Integration 2
              script:
                - ./integration-tests.sh --batch 2

Example — exclude a parallel step from fail-fast

pipelines:
  default:
    - step:
        name: Build
        script:
          - ./build.sh
    - parallel:
        # this option forces all running steps in the group to stop if any step fails
        fail-fast: true
        steps:
          - step:
              name: Integration 1
              script:
                - ./integration-tests.sh --batch 1
          - step:
              name: Integration 2
              script:
                - ./integration-tests.sh --batch 2
          - step:
              # the option can be disabled for a step, so its failure
              # won't stop the other steps in the group
              fail-fast: false
              name: Upload metadata
              script:
                - ./upload-metadata.sh

Caches and Artifacts

Caches

The step caches option is used to indicate steps where dependencies are downloaded from external sources (such as package repositories like npm and PyPI). This allows a previously defined cache to be created, updated, or reused, avoiding re-downloading external build dependencies. You can use a defined custom cache, or use one of the predefined caches. For a complete list of predefined caches, see Caches — Predefined caches.

For information on using the caches option, see Caches.

Property — caches

Required — No

Data type — A list of Strings (YAML spec - Sequence)

Allowed values — Names of the Pre-defined Pipelines caches and the names of custom caches.

Allowed parent properties — step

Example — using the caches option to cache dependencies for a Ruby project

definitions:
  caches:
    my-bundler-cache: vendor/bundle
pipelines:
  default:
    - step:
        caches:
          - my-bundler-cache # Cache is defined above in the definitions section
          - node # Pre-defined Pipelines cache
        script:
          - bundle install --path vendor/bundle
          - ruby -e 'print "Hello, World\n"'

Artifacts

The artifacts option is used to list the files or directories that contain build artifacts that are required for steps later in the pipeline. The artifact paths are relative to the BITBUCKET_CLONE_DIR variable and can be defined using glob patterns.

For details on artifacts, see using artifacts in steps.

Property — artifacts

Required — No

Data type — Either a list of file paths (glob patterns are allowed), or a block of new-line separated key-value pairs (the download and paths properties)

Allowed parent properties — step

Allowed child properties — download and paths

Example — using artifacts to pass built code to a following step for testing

pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm run build
        artifacts:
          - dist/**
    - step:
        name: Test code from build step stored in the dist/ directory
        script:
          - npm test

Example — using artifacts, download, and paths to pass artifacts to later steps

pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm run build
        artifacts: # Store build artifacts for use in the following steps
          - dist/**
    - step:
        name: Lint code and store results
        script:
          - npm lint > results.txt
        artifacts:
          download: false # Block artifact downloads, they're not needed for this step
          paths: # Store the linting result (in addition to the dist/ directory)
            - results.txt
    - step:
        name: Test code from build step stored in the dist/ directory
        script:
          - npm test

Artifact downloads

The artifacts download option is used to control whether artifacts from previous steps are downloaded at the start of the step.

  • download: true — (default behavior) the artifacts from previous steps are downloaded at the start of the step and are available to the scripts in the step.

  • download: false — artifacts from previous steps will not be available in this step.

For details on using artifacts, see using artifacts in steps.

Property — download

Required — No

Data type — Boolean

Allowed values — true or false

Default value — true

Allowed parent properties — artifacts

Example — using the download option to prevent artifacts from downloading in a step
pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm run build
        artifacts: # Store build artifacts for use in the following steps
          - dist/**
    - step:
        name: Lint code and store results
        script:
          - npm lint > results.txt
        artifacts:
          download: false # Block artifact downloads, they're not needed for this step
    - step:
        name: Test code from build step stored in the dist/ directory
        script:
          - npm test
Example — using the artifacts download and paths options together
pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm run build
        artifacts: # Store build artifacts for use in the following steps
          - dist/**
    - step:
        name: Lint code and store results
        script:
          - npm lint > results.txt
        artifacts:
          download: false # Block artifact downloads, they're not needed for this step
          paths: # Store the linting result (in addition to the dist/ directory)
            - results.txt
    - step:
        name: Test code from build step stored in the dist/ directory
        script:
          - npm test

Artifact Paths

The artifact paths option is used to list the files or directories that contain build artifacts that are required for steps later in the pipeline. The paths option is only needed if the artifacts download option is defined, otherwise these paths can be listed under the artifacts option. The artifact paths are relative to the BITBUCKET_CLONE_DIR variable and can be defined using glob patterns.

For details on artifacts, see using artifacts in steps.

Property — paths

Required — No

Data type — A list of file paths (glob patterns are allowed) (YAML spec - Sequence)

Allowed parent properties — artifacts

Example — using the artifacts paths option to retain files for use in later steps
pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm install
          - npm run build
        artifacts: # Store build artifacts for use in the following steps
          - dist/**
    - step:
        name: Lint code and store results
        script:
          - npm lint > results.txt
        artifacts:
          download: false # Block artifact downloads, they're not needed for this step
          paths: # Store the linting result (in addition to the dist/ directory)
            - results.txt
    - step:
        name: Test code from build step stored in the dist/ directory
        script:
          - npm test

Pipes

Pipe

Pipes make complex tasks easier, by doing a lot of the work behind the scenes. This means you can just select which pipe you want to use, and supply the necessary variables. You can look at the repository for the pipe to see what commands it is running.

For information on Pipes, including how to create and use custom Pipes, see Use pipes in Bitbucket Pipelines.

For a list of available Pipes and instructions on how to use them, see Bitbucket Pipes Integrations.

Property — pipe

Required — No

Data type — String

Allowed values — Address of a Docker-based pipe.

Allowed parent properties — script and after-script

Allowed child properties — variables (required for most pipes) and name (optional)

Example — using the pipe option to send an Opsgenie alert using the official Opsgenie pipe

pipelines:
  default:
    - step:
        name: Alert Opsgenie
        script:
          - pipe: atlassian/opsgenie-send-alert:latest
            variables:
              GENIE_KEY: $GENIE_KEY
              MESSAGE: "Danger, Will Robinson!"
              DESCRIPTION: "An Opsgenie alert sent from Bitbucket Pipelines"
              SOURCE: "Bitbucket Pipelines"
              PRIORITY: "P1"

Example — using pipes to send a message to Opsgenie and Slack

pipelines:
  default:
    - step:
        name: Alert everyone!
        script:
          - pipe: atlassian/opsgenie-send-alert:latest
            name: Send alert to Opsgenie
            variables:
              GENIE_KEY: $GENIE_KEY
              MESSAGE: 'Wake up!'
          - pipe: atlassian/slack-notify:latest
            name: Send alert to Slack
            variables:
              WEBHOOK_URL: $SLACK_WEBHOOK
              PRETEXT: 'Alert Everyone!'
              MESSAGE: 'We have a problem!'

Example — using the pipe option to run a custom pipe

pipelines:
  default:
    - step:
        name: Running my custom pipe
        script:
          - pipe: docker://<DockerAccountName>/<ImageName>:<version>
            variables:
              USERNAME: $My_username
              PASSWORD: $Password

Pipe variables

The pipe variables option is used to configure the environment variables of a pipe. The variables required or available vary between pipes.

For information on Pipes, including how to create and use custom Pipes, see Use pipes in Bitbucket Pipelines.

For a list of available Pipes and instructions on how to use them, see Bitbucket Pipes Integrations.

Secrets and login credentials should be stored as user-defined pipeline variables to avoid being leaked. For details, see Variables and secrets — User-defined variables.

Property — variables

Required — Varies between pipes

Data type — Block of new-line separated key-value pairs (YAML spec - Block Mapping)

Allowed parent properties — pipe

Example — using variables to configure and run two pipes

The following example shows the Opsgenie Send Alert pipe and the Slack Notify pipe.

pipelines:
  default:
    - step:
        name: Alert everyone!
        script:
          - pipe: atlassian/opsgenie-send-alert:latest
            name: Send alert to Opsgenie
            variables:
              GENIE_KEY: $GENIE_KEY
              MESSAGE: 'Wake up!'
          - pipe: atlassian/slack-notify:latest
            name: Send alert to Slack
            variables:
              WEBHOOK_URL: $SLACK_WEBHOOK
              PRETEXT: 'Alert Everyone!'
              MESSAGE: 'We have a problem!'

Runners

Runs-on

Only available for self-hosted pipeline runners.

To run a pipeline step on a self-hosted runner, add the runs-on option to the step. When the pipeline is run, the step will run on the next available runner that has all the listed labels. If all matching runners are busy, your step will wait until one becomes available again. If you don't have any online runners in your repository that match all labels, the step will fail.

For information on:

Property — runs-on

Required — No

Data type — Either a String or a list of Strings (YAML spec - Sequence)

Allowed values — Any Label assigned to a self-hosted Repository or Workspace Pipeline runner (such as self.hosted).

Allowed parent properties — step

Example — using the runs-on option to run a step on a self-hosted runner

pipelines:
  default:
    - step:
        name: Step 1
        runs-on:
          - 'self.hosted'
          - 'my.custom.label'
        script:
          - echo "This step will run on a self-hosted runner with the 'my.custom.label' and 'self.hosted' labels."
    - step:
        name: Step 2
        script:
          - echo "This step will run on Atlassian's infrastructure as usual."

Docker images

For details on the image options, including using the image options for steps, see Docker image options.

Git clone options

For details on the clone options, including using the clone options for steps, see Git clone behavior.

Flow control

Condition

The condition option prevents a step or stage from running unless a condition or rule is satisfied. Currently, the only condition supported is changesets. Use changesets to execute a step or stage only if one of the modified files matches the expression in includePaths. The file match pattern specified in the includePaths is relative to the $BITBUCKET_CLONE_DIR directory.

Changes that are taken into account

In a pull-requests pipeline, all commits are taken into account, and if you provide an includePaths list of patterns, the step or stage will be executed when at least one changed file matches one of the conditions. The format for pattern matching follows glob patterns.

For other types of pipelines, only the last commit is considered. If you push multiple commits to a branch at the same time, or push multiple times to a given branch, you might experience non-intuitive behavior when a failing pipeline turns green only because the failing step or stage is skipped on the next run.

Property — condition

Required — No

Data type — Block of new-line separated key-value pairs (YAML spec - Block Mapping)

Allowed parent properties — step and stage

Allowed child properties — changesets (required)

Example — using the condition option to only run a step when certain files change

pipelines:
  default:
    - step:
        name: step1
        script:
          - echo "failing paths"
          - exit 1
        condition:
          changesets:
            includePaths:
              # only xml files directly under path1 directory
              - "path1/*.xml"
              # any changes in deeply nested directories under path2
              - "path2/**"

Example — using the condition option to only run a stage when certain files change

pipelines:
  default:
    - stage:
        name: Build and test
        condition:
          changesets:
            includePaths:
              # only xml files directly under path1 directory
              - "path1/*.xml"
              # any changes in deeply nested directories under path2
              - "path2/**"
        steps:
          - step:
              name: Build app
              script:
                - sh ./build-app.sh
          - step:
              name: Run unit tests
              script:
                - sh ./run-tests.sh

Condition changesets

The changesets option is used to indicate that the condition for a step or stage is a change in a particular file or files (includePaths).

Property — changesets

Required — Required when using the condition option.

Data type — Block of new-line separated key-value pairs (YAML spec - Block Mapping)

Allowed parent properties — condition

Allowed child properties — includePaths (required)

Example — using the condition option to only run a step when certain files change
pipelines:
  default:
    - step:
        name: step1
        script:
          - echo "failing paths"
          - exit 1
        condition:
          changesets:
            includePaths:
              # only xml files directly under path1 directory
              - "path1/*.xml"
              # any changes in deeply nested directories under path2
              - "path2/**"
Example — using the condition option to only run a stage when certain files change
pipelines:
  default:
    - stage:
        name: Build and test
        condition:
          changesets:
            includePaths:
              # only xml files directly under path1 directory
              - "path1/*.xml"
              # any changes in deeply nested directories under path2
              - "path2/**"
        steps:
          - step:
              name: Build app
              script:
                - sh ./build-app.sh
          - step:
              name: Run unit tests
              script:
                - sh ./run-tests.sh
Conditional changeset included directories

When used with the condition and changesets options, the includePaths option allows you to provide a list of files or directories to check for changes. If a file in the list is changed by a commit, the step or stage will run; otherwise, it will be skipped.

Property — includePaths

Required — Required when using the changesets option.

Data type — A list of file paths (glob patterns are allowed) (YAML spec - Sequence)

Allowed parent properties — changesets

Example — using the condition option to only run a step when certain files change
pipelines:
  default:
    - step:
        name: step1
        script:
          - echo "failing paths"
          - exit 1
        condition:
          changesets:
            includePaths:
              # only xml files directly under path1 directory
              - "path1/*.xml"
              # any changes in deeply nested directories under path2
              - "path2/**"
Example — using the condition option to only run a stage when certain files change
pipelines:
  default:
    - stage:
        name: Build and test
        condition:
          changesets:
            includePaths:
              # only xml files directly under path1 directory
              - "path1/*.xml"
              # any changes in deeply nested directories under path2
              - "path2/**"
        steps:
          - step:
              name: Build app
              script:
                - sh ./build-app.sh
          - step:
              name: Run unit tests
              script:
                - sh ./run-tests.sh

Trigger

Sets the step or stage to run automatically (default behavior) or only when manually triggered by a user in the Bitbucket user interface. The first step or stage in a pipeline can't be manual. To set a whole pipeline to run manually, use a custom pipeline trigger. Manual steps and stages:

  • Can’t be the first step or stage in a pipeline.

  • Can only be executed in the order that they are configured. You cannot skip a manual step or stage.

  • Can only be executed if the previous step or stage has successfully completed.

  • Can only be triggered by users with write access to the repository.

  • Are triggered through the Pipelines web interface.

If your build uses both manual steps and artifacts, the artifacts are stored for 14 days following the execution of the step that produced them. After this time, the artifacts expire and any manual steps and manual stages in the pipeline can no longer be executed.

Property — trigger

Required — No

Data type — String

Allowed values — automatic and manual

Default value — automatic

Allowed parent properties — step and stage

Example — using trigger to set a step to manual

pipelines:
  default:
    - step:
        name: Build
        script:
          - npm run build
        artifacts:
          - dist/**
    - step:
        name: Deploy
        trigger: manual
        script:
          - ./deploy.sh

Example — using trigger to set a stage to manual

pipelines:
  default:
    - stage:
        name: Linting
        steps:
          - step:
              script:
                - sh ./run-linter.sh
    - stage:
        name: Build and test
        trigger: manual
        steps:
          - step:
              name: Build app
              script:
                - sh ./build-app.sh
          - step:
              name: Run unit tests
              script:
                - sh ./run-tests.sh

Services — service containers and OIDC resources

OpenID Connect (OIDC) resources

The oidc option enables the use of OpenID Connect to connect a pipeline step to a resource server. The oidc value must be set to true to set up and configure OpenID Connect. For details on using OIDC with pipelines, see Integrate Pipelines with resource servers using OIDC.

Property — oidc

Required — No

Data type — Boolean

Allowed values — true or false

Default value — false

Allowed parent properties — step

Example — using the oidc option to connect a pipeline step to a resource server

pipelines:
  default:
    - step:
        oidc: true
        script:
          - echo "I can access data through OpenID Connect!"
          - aws sts assume-role-with-web-identity --role-arn arn:aws:iam::XXXXXX:role/projectx-build --role-session-name build-session --web-identity-token "$BITBUCKET_STEP_OIDC_TOKEN" --duration-seconds 1000

Services

Bitbucket Pipelines can create separate Docker containers for services, which results in faster builds and easy service editing. For details on creating services, see Databases and service containers. The services option is used to indicate which steps require previously defined services.

Property — services

Required — No

Data type — A list of Strings (YAML spec - Sequence)

Allowed values — Names of services defined under definitions > services

Allowed parent properties — step

Example — using the step services option to indicate which step requires the my-service-name database service

definitions:
  services:
    my-service-name:
      image: mariadb:latest
      variables:
        MARIADB_USER: $MY_MARIADB_USER
        MARIADB_PASSWORD: $MY_MARIADB_PASSWORD
        MARIADB_ROOT_PASSWORD: $MARIADB_ADMIN_PASSWORD
pipelines:
  default:
    - step:
        name: Hello world example
        services:
          - my-service-name
        script:
          - echo "Hello, World"

Deployments

Deployment

Sets the environment for a deployment step or stage and is used to organize the Deployment dashboard. All steps that belong to a deployment stage are deployment steps. The default environments are test, staging, and production. To set the deployment environment of a step or stage, include the environment name.

For details on:

Property — deployment

Required — No

Data type — String

Allowed values — The name of a Deployment environment

Allowed parent properties — step and stage

Example — using deployment to set the deployment environment for a step

pipelines:
  default:
    - step:
        name: Deploy to production
        deployment: production env 1
        script:
          - python deploy.py prod_env_1

Example — using deployment to set the deployment environment for stages

pipelines:
  default:
    - stage:
        name: Build and test
        deployment: staging
        steps:
          - step:
              name: Build app
              script:
                - sh ./build-app.sh
          - step:
              name: Run unit tests
              script:
                - sh ./run-tests.sh
    - stage:
        name: Deploy to Production
        deployment: prod
        trigger: manual
        steps:
          - step:
              name: Build app
              script:
                - sh ./build-app.sh
          - step:
              name: Run unit tests
              script:
                - sh ./run-tests.sh
