Pipeline build failed with "Container 'Docker' exceeded memory limit" error
Platform Notice: Cloud Only - This article only applies to Atlassian products on the cloud platform.
Summary

Solution
Possible cause:
The Docker service container in the Pipelines build requires more memory than is currently allocated to it.
Troubleshooting Steps:
Was the pipeline YAML already configured to allocate more than the default 1 GB of memory to the Docker service?
Please note that increasing the Pipeline build step size to 2x, 4x, or 8x (using the size attribute) does not automatically increase the memory allocated to the Docker service container. A Docker service in a 2x, 4x, or 8x step is still allocated 1 GB of memory by default, unless you explicitly allocate more memory in the YAML file (an example is given below).
YES
Raise a Support Ticket and share the Pipeline build URL from bitbucket.org.
NO
A custom memory value can be configured for the Docker service to increase its allocation beyond 1 GB by adding the definition below to your YAML file:
definitions:
  services:
    docker:
      memory: 3072  # Memory in MB - allocate 3GB (3072MB) of memory to the docker service
The Docker service can be configured to use up to:
3 GB (3072 MB) of memory in regular 1x steps.
7 GB (7168 MB) of memory in 2x steps.
15 GB (15360 MB) of memory in 4x steps.
31 GB (31744 MB) of memory in 8x steps.
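To illustrate how the size attribute and the service memory setting work together, the following is a minimal bitbucket-pipelines.yml sketch of a 2x step whose Docker service is allocated the 7 GB maximum. The step name and build command are illustrative placeholders, not part of any required configuration:

```yaml
definitions:
  services:
    docker:
      memory: 7168  # 7 GB - the maximum for a Docker service in a 2x step

pipelines:
  default:
    - step:
        name: Build image   # illustrative step name
        size: 2x            # increases the step's total memory; does NOT by itself raise the Docker service limit
        services:
          - docker
        script:
          - docker build -t my-app .  # illustrative build command
```

Note that both settings are needed: the size attribute raises the step's overall memory ceiling, while definitions.services.docker.memory raises the Docker service's share of it.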
Pipes use the Docker service internally, so memory-related issues with pipes can also be fixed by allocating more memory to the Docker service.
It is also possible to configure multiple Docker services with different memory limits if you have specific requirements.
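A sketch of what this could look like, assuming a second service named docker-large declared with type: docker (the service name is arbitrary; the step names and commands are illustrative):

```yaml
definitions:
  services:
    docker:
      memory: 1024   # default-sized Docker service for lightweight steps
    docker-large:
      type: docker   # marks this custom-named service as a Docker daemon
      memory: 3072   # larger allocation for memory-hungry builds

pipelines:
  default:
    - step:
        name: Lint            # illustrative
        services:
          - docker
        script:
          - docker version
    - step:
        name: Build image     # illustrative
        services:
          - docker-large
        script:
          - docker build -t my-app .
```

Each step then opts into whichever Docker service matches its memory needs.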