Pipelines build hangs while building a Docker image

Platform Notice: Cloud Only - This article only applies to Atlassian products on the cloud platform.

Summary

A Pipelines build hangs while building a Docker image with the "docker build" command and eventually fails after exceeding the build time limit.

Environment

Bitbucket Cloud Pipelines

Diagnosis

  1. Rerun the last successful build, which uses the same bitbucket-pipelines.yml config but an older commit.

    • If that build completes successfully, the changes introduced in the latest commit most likely caused the hang.

  2. Compare the changes between the failed build's commit and the last successful build's commit.

    • This can be done by navigating to the Repository Source page > the more icon (top right) > Compare branches and tags. Alternatively, compare the two commits locally with git, as sketched after this list.

    • ℹ️ Common things to check are changes to the commands in the Dockerfile and any environment/package configuration added in the latest commit.

  3. Review the bitbucket-pipelines.yml config to ensure that the docker service has enough memory to complete the build.
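
For step 2, a quick local alternative to the UI comparison is a git diff restricted to the files that most commonly affect the build. This is a minimal sketch: the commit hashes are placeholders, and the file list is only an assumption based on the files discussed in this article.

# Diff the last known-good commit against the failing one.
# <last-good-sha> and <failed-sha> are placeholders for the real commit hashes.
git fetch origin
git diff <last-good-sha> <failed-sha> -- bitbucket-pipelines.yml Dockerfile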

Cause

In most cases, the issue occurs when the docker service does not have enough memory to execute the commands within the Dockerfile.

An example of an actual case:

  • The team configured an 800 MB memory limit for their Pipelines docker service.

  • The team added more Ruby dependencies to their Gemfile/Gemfile.lock and included them in their Docker image with the "RUN bundle install" command.

  • With the many added dependencies, 800 MB of docker service memory was no longer sufficient for the "RUN bundle install" command in the Dockerfile to complete (see the sketch below).
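
As an illustration only (the actual Dockerfile from the case above is not shown in this article), a Dockerfile along these lines would hit the limit at the bundle install layer, because Bundler resolves and builds every declared gem inside the memory allocated to the docker service:

# Hypothetical Dockerfile reconstructing the scenario above
FROM ruby:3.2

WORKDIR /app

# Copy the dependency manifests first so the install layer can be cached
COPY Gemfile Gemfile.lock ./

# The memory-hungry step: every added gem increases the memory needed
# by the docker service that executes "docker build"
RUN bundle install

COPY . .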

Solution

This can be resolved by increasing the docker service memory limit in the bitbucket-pipelines.yml config to match the needs of the build.

ℹ️ Please note that increasing the step memory is different from increasing the service memory; see Databases and service containers (Bitbucket Cloud documentation) for more details.

default:
  - step:
      services:
        - docker
      script:
        - echo "This step is only allowed to consume 1024 MB of memory"
        - echo "Services are consuming the rest. docker 3072 MB"
definitions:
  services:
    docker:
      memory: 3072  # Increase memory for docker service to 3GB
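
If the build still runs out of memory with the docker service at 3072 MB, the step size can also be increased so that more total memory is available to split between the step and its services. The snippet below is a sketch, not a recommendation: the image tag is a placeholder, and the exact memory limits for larger steps should be confirmed in the Bitbucket Cloud documentation.

default:
  - step:
      size: 2x                         # Larger step: more total memory for the step and its services
      services:
        - docker
      script:
        - docker build -t my-image .   # "my-image" is a placeholder tag
definitions:
  services:
    docker:
      memory: 5120                     # Example value; a 2x step allows the docker service to exceed 3072 MB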
Updated on April 2, 2025
