Bitbucket Pipeline build error "Container 'docker' exceeded memory limit"
Platform Notice: Cloud Only - This article only applies to Atlassian apps on the cloud platform.
Summary
Your Pipelines build fails, and the following error is displayed on the build page, right next to the build log:
"Container 'docker' exceeded memory limit."
Cause
The docker service container in your Pipelines step requires more than the currently allocated memory.
Please note that if your step uses a pipe, a docker service is automatically added to the step so that the pipe can run. Even if you have not defined a docker service for the step in your bitbucket-pipelines.yml file, a step that uses a pipe still uses a docker service.
Solution
Allocate more memory to the Docker service
The docker service is allocated 1024 MB of memory by default. If your build fails with the error mentioned in the Summary, you'll need to allocate more memory to the docker service.
Please note that increasing the Pipeline build step size to 2x, 4x, 8x, or 16x (using the size attribute) will not automatically increase the memory allocated to the Docker service container. Regardless of the step size, the Docker service will still be allocated 1 GB of memory by default unless you explicitly allocate more memory to it in the bitbucket-pipelines.yml file (an example is given below).
You can use the following definition in your bitbucket-pipelines.yml file to increase the memory allocated to the docker service:
definitions:
  services:
    docker:
      memory: 3072  # Memory in MB - allocate 3 GB (3072 MB) of memory to the docker service
If your step is using a pipe, you will also need to add a docker service definition to the step so that the memory limit will be applied:
definitions:
  services:
    docker:
      memory: 3072  # Memory in MB - allocate 3 GB (3072 MB) of memory to the docker service

pipelines:
  default:
    - step:
        services:
          - docker
        script:
          - echo "Build container is allocated 1024 MB of memory"
          - pipe: atlassian/git-secrets-scan:3.1.0
Maximum memory that can be allocated to the Docker service
The maximum memory that can be allocated to the Docker service depends on the size of your step and whether your step uses any other services. It is equal to the step's total memory, minus 1024 MB (required by the step's build container), minus the memory allocated to any other services used in the step.
If your step is not using any other services, the Docker service can be configured to use up to:
3 GB (3072 MB) of memory in regular 1x steps.
7 GB (7168 MB) of memory in 2x steps.
15 GB (15360 MB) of memory in 4x steps.
31 GB (31744 MB) of memory in 8x steps.
63 GB (63488 MB) of memory in 16x steps.
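For example, here is a minimal sketch of a 2x step that allocates the maximum 7168 MB to the docker service; the docker info command is only a placeholder for your own build commands:

definitions:
  services:
    docker:
      memory: 7168  # 2x step: 8192 MB total, minus 1024 MB reserved for the build container

pipelines:
  default:
    - step:
        size: 2x          # doubles the step's total memory; docker memory must still be set explicitly
        services:
          - docker
        script:
          - docker info   # placeholder for your own build commands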
It's also possible to configure multiple docker services with different memory limits if you have specific requirements.
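As a sketch, you can declare an additional docker service with a custom name and type: docker (so Pipelines treats it as a Docker service), then reference whichever service each step needs. The name docker-large below is only an example; adjust the memory values to your own requirements:

definitions:
  services:
    docker:
      memory: 1024        # default-sized docker service for regular steps
    docker-large:         # custom-named docker service (example name)
      type: docker
      memory: 3072

pipelines:
  default:
    - step:
        services:
          - docker
        script:
          - docker version
    - step:
        services:
          - docker-large
        script:
          - docker info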
If you still experience issues, you can raise a Support Ticket and share the Pipeline build URL from bitbucket.org.