Artifacts are files that are produced by a step. Once you've defined them in your pipeline configuration, you can share them with a following step or export them to keep the artifacts after a step completes. For example, you might want to use reports or JAR files generated by a build step in a later deployment step, download an artifact generated by a step, or upload it to external storage.
There are some things to remember:
Files that are in the BITBUCKET_CLONE_DIR at the end of a step can be configured as artifacts. The BITBUCKET_CLONE_DIR is the directory in which the repository was initially cloned.
You can use glob patterns to define artifacts. Glob patterns that start with a * need to be put in quotes (see the snippet after this list). Note: as these are glob patterns, the path segments "." and ".." won't work. Use paths relative to the build directory.
Artifact paths are relative to the BITBUCKET_CLONE_DIR.
Artifacts that are created in a step are available to all the following steps.
Artifacts created in a parallel step are not guaranteed to be accessible to other steps within the same group of parallel steps; depending on when each step runs, the artifact may or may not exist when another step in the group requests it.
Artifacts will be deleted 14 days after they are generated.
By default, all available artifacts will be downloaded at the start of a step. You can control whether a step downloads artifacts by specifying the download flag.
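As a quick illustration of the quoting rule and the download flag, here is a minimal sketch; the step names, file names, and scripts are made up for this example:

pipelines:
  default:
    - step:
        name: Produce logs
        script:
          - echo "build output" > build.log
        artifacts:
          # a glob starting with * must be quoted, otherwise YAML
          # would read the unquoted * as an alias
          - "*.log"
    - step:
        name: Summarise
        artifacts:
          download: false # skip downloading earlier artifacts in this step
          paths:
            - summary.txt
        script:
          - echo "done" > summary.txt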
In the example bitbucket-pipelines.yml file that follows, we show how to configure artifacts to share them between steps.
When the script for 'Build and test' completes, all files under the dist folder and the .txt files in the reports folder (both under the BITBUCKET_CLONE_DIR) are kept as artifacts, with the same paths.
'Integration test' and 'Deploy to beanstalk' can access files in dist and reports, created by the first step.
Any changes to dist or reports by 'Integration test' will not be available in later steps because they have not been specified as artifacts in 'Integration test'. If you wanted to keep the changes, you would need to define them as artifacts in this step, too.
Artifacts will not be downloaded during 'Display success message' because download is set to false, so files from earlier steps are not available in that step. The step still produces the artifact success.txt, making it available to any later steps and for download.
bitbucket-pipelines.yml
pipelines:
  default:
    - step:
        name: Build and test
        image: node:10.15.0
        caches:
          - node
        script:
          - npm install
          - npm test
          - npm run build
        artifacts: # defining the artifacts to be passed to each future step
          - dist/**
          - reports/*.txt
    - step:
        name: Integration test
        image: node:10.15.0
        caches:
          - node
        services:
          - postgres
        script:
          # using one of the artifacts from the previous step
          - cat reports/tests.txt
          - npm run integration-test
    - step:
        name: Deploy to beanstalk
        image: python:3.5.1
        script:
          - python deploy-to-beanstalk.py
    - step:
        name: Display success message
        artifacts:
          download: false # do not download artifacts in this step
          paths: # defining artifacts to be passed to each future step
            - success.txt
        script:
          - echo "Deployment successful!" > success.txt

definitions:
  services:
    postgres:
      image: postgres:9.6.4
Manual steps will have build artifacts produced by any previous steps copied into their working directory, similar to automatic steps.
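For example, a manual step that deploys an artifact built earlier might look like the following sketch. The step names and commands are illustrative; trigger: manual is the option that makes a step wait to be run from the Pipelines UI.

pipelines:
  default:
    - step:
        name: Build
        script:
          - mkdir dist
          - echo "built app" > dist/app.txt
        artifacts:
          - dist/**
    - step:
        name: Manual deploy
        trigger: manual # runs only when started from the Pipelines UI
        script:
          # dist/** from the Build step has been copied into this step's
          # working directory, just as it would be for an automatic step
          - ls dist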
You can download artifacts generated by a step:
Select the Artifact tab of the pipeline result view
Click the download icon
Artifacts are stored for 14 days following the execution of the step that produced them. After this time, the artifacts are expired and any manual steps later in the pipeline can no longer be executed.
If you need artifact storage for longer than 14 days (or more than 1 GB), we recommend using your own storage solution, like Amazon S3 or a hosted artifact repository like JFrog Artifactory. Setting a reasonable time limit for build artifacts allows us to manage our costs so we don't have to charge for storage and transfer costs of build artifacts in Pipelines.
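One way to do this is to add a step that copies the artifact to your own bucket with the AWS CLI. A rough sketch, assuming the dist/** artifact from the earlier example, an S3 bucket of your own (my-artifact-bucket below is a placeholder), and AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION set as repository variables:

    - step:
        name: Archive artifacts to S3
        image: amazon/aws-cli # any image with the AWS CLI installed works
        script:
          # dist/** was declared as an artifact in an earlier step, so it is
          # downloaded into this step's clone directory before the script runs
          - aws s3 cp dist/ "s3://my-artifact-bucket/builds/${BITBUCKET_BUILD_NUMBER}/" --recursive

BITBUCKET_BUILD_NUMBER is one of the default variables Pipelines provides, so each build's artifacts land under their own prefix.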