We've gathered some examples of how people use Bitbucket Pipelines to give you some inspiration. There are descriptions of how you might structure your YAML file, live repositories you can look at to see pipelines working, and a collection of posts from people around the globe talking about how they use Bitbucket Pipelines.
Example bitbucket-pipelines.yml files
Generally, all commits happen on development branches, which trigger the default pipeline. This pipeline builds and tests your code and, if successful, automatically runs the script to deploy to your development environment. You can then merge all code onto a single release branch and progressively deploy at the click of a button (or deploy automatically by removing the trigger: manual option).
```yaml
image: <Whatever Docker build image you need>

pipelines:
  branches:
    release: # This pipeline has 4 steps.
      - step:
          name: Build & test
          script:
            - ./run-build
            - ./run-tests
          artifacts: # This will carry across the files you want to deploy to all subsequent steps
            - ~/build/files-to-deploy/*
      - step:
          name: Deploy to Test
          deployment: test
          trigger: manual # This step needs to be started on the UI. Remove this line to have it run automatically.
          script:
            - ./deploy-to-test
      - step:
          name: Deploy to Staging
          deployment: staging
          trigger: manual # This step needs to be started on the UI. Remove this line to have it run automatically.
          script:
            - ./deploy-to-staging
      - step:
          name: Deploy to Production
          deployment: production
          trigger: manual
          script:
            - ./deploy-to-production
  default:
    - step:
        name: Build & test
        script:
          - ./run-build
          - ./run-tests
```
- Branch workflows: https://confluence.atlassian.com/bitbucket/branch-workflows-856697482.html
- Deployments: https://confluence.atlassian.com/bitbucket/bitbucket-deployments-940695276.html
- Artifacts: https://confluence.atlassian.com/bitbucket/using-artifacts-in-steps-935389074.html
- Manual triggers (see the 'Manual Step' section): https://confluence.atlassian.com/bitbucket/run-pipelines-manually-861242583.html
- bitbucket-pipelines.yml configuration: https://confluence.atlassian.com/bitbucket/configure-bitbucket-pipelines-yml-792298910.html
If you use different branches for staging and development, and would like to trigger a deploy to the staging server on a push to the staging branch, and a deploy to the production server on a push to the master branch, you can use a configuration like the one below.
With this configuration:
- pushes to master will run the master branch pipeline and deploy to production
- pushes to staging will run the staging branch pipeline and deploy to staging
- pushes to any other branch will run the default pipeline, which only runs the build and test step
```yaml
pipelines:
  default: # the default pipeline will just run your build and test commands
    - step:
        name: Build and test
        script:
          - ./run-build
          - ./run-tests
  branches:
    staging: # this staging branch pipeline will deploy to the staging environment
      - step:
          name: Build and test
          script:
            - ./run-build
            - ./run-tests
          artifacts: # This will carry across the files you want to deploy to all subsequent steps
            - ~/build/files-to-deploy/*
      - step:
          name: Deploy to Staging
          deployment: staging
          script:
            - ./deploy-to-staging
    master: # this master branch pipeline will deploy to the production environment
      - step:
          name: Build and test
          script:
            - ./run-build
            - ./run-tests
          artifacts:
            - ~/build/files-to-deploy/*
      - step:
          name: Deploy to Production
          deployment: production
          script:
            - ./deploy-to-production
```
Because the deployment steps have been marked with their respective environments (using the deployment keyword), you'll also be able to track the status of those environments on the deployments dashboard in Bitbucket.
The bitbucket-pipelines.yml file is read from the branch on which the push occurs, so be sure to merge this configuration into both the staging and master branches.
There are also some duplicate step definitions here that can be cleaned up using shared scripts or YAML anchors.
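As one way to remove that duplication, the build-and-test step could be defined once and reused with a YAML anchor. This is a sketch; the anchor name build-test is illustrative, not a required name:

```yaml
definitions:
  steps:
    - step: &build-test # anchor: define the shared build step once
        name: Build and test
        script:
          - ./run-build
          - ./run-tests
        artifacts:
          - ~/build/files-to-deploy/*

pipelines:
  default:
    - step: *build-test # alias: reuse the anchored step
  branches:
    staging:
      - step: *build-test
      - step:
          name: Deploy to Staging
          deployment: staging
          script:
            - ./deploy-to-staging
    master:
      - step: *build-test
      - step:
          name: Deploy to Production
          deployment: production
          script:
            - ./deploy-to-production
```

Anchors (&name) and aliases (*name) are standard YAML, so any later change to the build step only needs to be made in one place.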
A library that uses pipelines to create a Docker image and release it.
This is a real-world example of how Pipelines is used to release version updates (if you want to read more about what this library actually does, read the author's post on this indexer sidecar for Elasticsearch).
Other people's guides
We love it when we see people explore the possibilities that Pipelines offers and share their experience. Here are some guides from people outside of Atlassian:
- Building a Bitbucket Pipe as a casual coder
- Make an app in expo, test it, and then deploy it
- Build and deploy maven artifacts to CloudRepo
- How to publish Ruby Gems from Bitbucket Pipelines
- Deploy your Angular App to Firebase hosting via BitBucket Pipelines
- Continuous Deployment Pipeline to AWS EC2 using AWS Code Deploy
- Deploy to GCP Cloud Run with Bitbucket Pipelines