Premium only
Pipelines configuration sharing is a Bitbucket Cloud Premium feature. Learn more about Bitbucket Premium
Sharing your YAML configuration in Bitbucket Pipelines lets you define pipeline definitions once and reuse them across repositories in the same workspace, so you don't have to recreate each pipeline configuration repeatedly.
1. Create a repository in the workspace where you'll be adding the shared pipeline definition. For more information on creating a repository, refer to our Create a Git repository help document.
2. Create a bitbucket-pipelines.yml file with export: true as a top-level property, and put your pipeline definition under the definitions > pipelines section of the file.
All pipelines defined under the definitions > pipelines section will be exported and can be imported by other repositories in the same workspace. Check out the example below.
export: true
definitions:
  pipelines:
    share-pipeline-1:
      - step:
          name: "hello world"
          script:
            - echo hello
Placing exported pipelines under the definitions section keeps the pipeline definitions you want to export separate from the pipelines you want to run on the repository itself.
For example, you may run some basic .yaml validation logic in a pipeline against the repository containing the exported .yaml, but not want to export that validation logic for other repositories to consume.
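As a sketch of that separation (the pipeline names and scripts here are hypothetical, not from the product documentation):

```yaml
export: true

definitions:
  pipelines:
    # Exported: importable by other repositories in the workspace.
    shared-deploy:
      - step:
          name: "Deploy"
          script:
            - echo deploy

pipelines:
  # Not exported: runs only in this repository, e.g. to sanity-check the file itself.
  default:
    - step:
        name: "Validate YAML"
        script:
          - echo validating
```

Only the entries under definitions > pipelines are shared; the top-level pipelines section still behaves as usual for this repository.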
3. In the repository where you want to reuse the pipeline definition, use the import property at the location in your pipeline where you want to import the shared pipeline configuration.
pipelines:
  custom:
    import-pipeline:
      import: shared-pipeline:master:share-pipeline-1
The import statement uses the following structure: {repo-slug-of-exported-pipeline}:{branch/tag-to-import-from}:{pipeline-from-exported-file}
The import statement will always reference the HEAD commit of the targeted branch.
The import statement always requires an exact match on the pipeline name; glob pattern matching is not supported.
If you want your import statements to reference the exported .yaml at a particular point in time, use a tag instead of the branch name so the import resolves to a specific commit rather than the HEAD of the branch.
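For instance, a tag-pinned import might look like the following sketch (the repository slug shared-pipeline and the tag v1.0 are hypothetical):

```yaml
pipelines:
  custom:
    import-pinned-pipeline:
      # Resolves to the commit referenced by tag v1.0,
      # rather than the moving HEAD of a branch.
      import: shared-pipeline:v1.0:share-pipeline-1
```

This gives you a stable, point-in-time reference that will not change when the exporting repository's branch moves forward.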
You can now trigger import-pipeline in your repositories, and it will use the configuration from the exported .yaml.
Important note: Users can execute pipelines that import configurations from repositories they do not have direct access to, provided that pipeline configuration has explicitly been declared with export: true.
Shared Pipeline configs follow a different access control scheme than the one used for users.
This is done to avoid complex scenarios where a user doesn’t have access to a repo where a shared Pipeline config is stored, leading to their builds failing due to permissions issues.
We treat the importing of an external Pipeline as an action taken by the repository, not the user. The Pipeline is considered to be “shared” with all repositories in that Workspace if it has been set to export: true.
Once you mark a Pipeline .yaml file with export: true, assume that any repository in the same workspace can view the contents of that Pipeline configuration file, as the contents may be inferable via build logs and similar outputs.
Do not store sensitive information directly within the configuration of exported Pipelines. Always leverage variables and secrets for managing sensitive information as these values are inherited from the importing repository.
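As a sketch of that pattern (the variable name DEPLOY_TOKEN and the deploy script are hypothetical), an exported configuration references the variable and each importing repository supplies its own value:

```yaml
export: true
definitions:
  pipelines:
    shared-deploy:
      - step:
          name: "Deploy"
          script:
            # $DEPLOY_TOKEN is resolved from the importing repository's
            # variables/secrets, not from this exporting repository,
            # so no secret value ever appears in the shared file.
            - ./deploy.sh --token "$DEPLOY_TOKEN"
```

Each repository that imports shared-deploy would define DEPLOY_TOKEN as a secured repository variable.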
Note: If a user attempts to navigate via the UI to the source file where a shared Pipeline config is stored and they do not have permission to view the repository, they will not be able to view that source file in the UI.
Question: Can I share Pipeline Configurations across different Workspaces? Are my shared Pipeline Configurations secure from outside my Workspace?
Answer: Pipeline Config sharing is scoped to the Workspace in which Pipeline Configurations are exported from. Exported Pipeline Configurations cannot be accessed by repositories that are not in the same Workspace.
Question: How do I maintain the stability of the configuration when importing from another repo? What happens if the config in the exporting repo changes and breaks my build?
Answer: Pipelines Config import statements support pinning your import to either a Git Branch or a Git Tag.
When targeting a branch, the import will always pull from the HEAD of that branch, meaning you will get the latest version of the exported Pipeline Configuration. We recommend this model if you want to stay current with upstream changes.
Answer continued: When targeting a Tag, the Pipeline Configuration will always be imported from the commit referenced by the pinned Tag. This allows customers to create static point-in-time references and version their Pipeline Configurations. In the future, we may add support for pinning to specific commits if this is something users see value in.
Question: What happens if a Pipeline is rerun and the imported configuration has changed since the first time it was run?
Answer: If the entire Pipeline is rerun, the Pipeline Configuration will be reimported at the point in time that the rerun executes. If the configuration has changed since the initial run, the executed configuration will reflect the new configuration.
Question: What happens if a Step is rerun and the imported configuration has changed since the first time it was run?
Answer: If a single step is rerun, the configuration for that step will be exactly the same as the first time the step was run. To reiterate, this is different from the behavior when the entire Pipeline is rerun.
Question: How are variables and secrets handled when importing Pipeline Configurations?
Answer: All the variables and secrets from the importing repository will be available to the build, including when that build is executing an imported Pipeline Configuration. If the imported configuration uses variables in its scripts, and the importing repository provides values for those variables, those values will be used in the build.
Variable values from the exporting repository are not shared or reused in importing repositories.
Note: There is a known limitation with using Variables in Custom Pipelines that utilize a shared Pipeline Configuration. If a repository imports a shared Pipeline Configuration and uses it as part of a Custom Pipeline, variable values specified at runtime via the UI will not be passed to the pipeline execution; the default values specified in the exported Pipeline configuration will be used instead.
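To illustrate the limitation, consider an exported custom-pipeline configuration like this sketch (the pipeline name, variable name, and default value are hypothetical):

```yaml
export: true
definitions:
  pipelines:
    shared-pipeline-with-vars:
      - variables:
          - name: ENVIRONMENT
            # This default is what the imported run uses, even if a
            # different value is entered in the UI at trigger time.
            default: staging
      - step:
          script:
            - echo "Deploying to $ENVIRONMENT"
```

If another repository imports this as a custom pipeline, entering a value for ENVIRONMENT in the run dialog will not reach the build; it falls back to staging.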
Question: Can I customize elements of the imported Pipeline config like Step Size, Max Time, Image, etc?
Answer: The only way to configure the imported Pipeline Configuration is via Variables and Secrets. However, we are working on capabilities that will give customers much more flexibility in sharing and configuring Pipelines in the future, so stay tuned.
Question: Can individual steps be shared?
Answer: Yes, individual steps can be shared. See the example provided in the following community post for more information: Share a step across pipelines.
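One common pattern for reusing a step within a file is a YAML anchor; here is a sketch (the step name, script, and pipeline names are hypothetical):

```yaml
definitions:
  steps:
    # Define the step once and give it an anchor.
    - step: &build-step
        name: "Build"
        script:
          - echo build

pipelines:
  branches:
    main:
      # Reuse the same step via the alias.
      - step: *build-step
  custom:
    manual-build:
      - step: *build-step
```

The anchor (&build-step) names the step definition, and each alias (*build-step) expands to an identical copy of it.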
Question: How can I test shared pipeline configurations in shared-pipeline if it is stored in the definitions section?
Answer: You can use a YAML anchor. Here is an example:
export: true
definitions:
  pipelines:
    share-pipeline-1: &share-pipeline-1
      - step:
          name: "hello world"
          script:
            - echo hello
pipelines:
  custom:
    test-share-pipeline-1: *share-pipeline-1
Question: What is the precedence for definitions such as caches or services if I have defined them in two different repositories with the same name?
Answer: For global options (image, clone, max-time, size, docker) and definitions such as caches and services, we always take them from the exporting repository, not the importing repository.
For example, if you have a bitbucket-pipelines.yml file like this in a repository:
export: true
image: atlassian/default-image:3
definitions:
  services:
    docker:
      memory: 1024
    redis:
      image: redis:6
      memory: 512
  caches:
    my-custom-cache-without-key: cache
    my-custom-cache:
      key:
        files:
          - cache-key.txt
      path: cache
  pipelines:
    export-pipeline:
      - step:
          services:
            - docker
            - redis
          script:
            - ls -la
            - ls -la cache/ || true
            - echo $BITBUCKET_BUILD_NUMBER > artifact.txt
            - echo $BITBUCKET_BUILD_NUMBER > cache/cache.txt
          artifacts:
            - artifact.txt
          caches:
            - my-custom-cache
            - my-custom-cache-without-key
And bitbucket-pipelines.yml in the importing repository:
image: maven:3.9-eclipse-temurin-11
definitions:
  services:
    docker:
      memory: 2048
    redis:
      image: redis:7
      memory: 1024
  caches:
    my-custom-cache:
      key:
        files:
          - local.txt
      path: local
    my-custom-cache-without-key: local-path
pipelines:
  default:
    import: export_repo:master:export-pipeline
The default pipeline triggered in the importing repository will use the docker and redis services defined in the exporting repository, with the redis:6 image for the redis service, and it will use my-custom-cache and my-custom-cache-without-key from the exporting repository, not the importing repository.
Also, the step's build image is atlassian/default-image:3, not maven:3.9-eclipse-temurin-11.