Bitbucket is great for keeping your source files and managing changes, especially when working on a team, but it's not the best place to keep large binary files, such as executables and other build artifacts, or videos and other media files.
We recommend repositories be kept under 2.0 GB to help ensure that our servers are fast and downloads are quick for our users. Bitbucket Cloud repositories have a 4.0 GB limit. When that limit is reached, you will only be able to push changes that undo the latest commits.
The repository limit for free repositories is 2 GB; however, at this time, we are not enforcing this limitation. Note that both enforcement of the limit and the limit itself are subject to change in the future.
Repository size limits
If the size of a repository is over these limits, you’ll see a warning in the Repository details panel.
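You can also get a rough idea of the size from the command line. Note that this measures the local clone's object store, which is only an approximation of the hosted repository size:

```shell
# Pack loose objects first so the reported size is closer to reality
git gc --quiet
# Report the approximate size of the local object store in human-readable units
git count-objects -vH
```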
Over 2.0 GB limit
You’ll get a warning on the command line if you push commits while over the 2.0 GB limit:
If you see this warning, you should perform maintenance on the repository to remove large files; see Removing large files below.
Over 4.0 GB limit
If the size exceeds the 4.0 GB limit, you will not be able to push any more commits:
The warning message is displayed following the first push attempted after the repository has exceeded the 4 GB limit. To continue making changes you’ll need to undo the last commit; see Undoing the last push below. This will bring the size below the 4.0 GB limit and remove the push restriction, allowing you to perform maintenance on the repository.
Over 10.0 GB limit
Above 10.0 GB, all pushes are rejected. You will need to raise a support request to get help reducing the size of your repository.
Undoing the last push
To remove large files, you need to rewrite history; otherwise, Git keeps the large files in its history.
Before doing these steps, you can enable the Delete dangling commits when over size limit feature in Labs to trigger garbage collection automatically.
This will delete the large files you remove from history when you rewind your branch head, avoiding the need to submit a Support request.
Rewind history to undo large commits
Rewind the branch containing the bad commit to just before that commit, then prune the last commit to bring the repository size under the limit. This process assumes that the bad commit is only on one branch and hasn't been merged into other branches.
It is essential to inform anyone else using the affected branch that you are undoing the last commit, and to make sure that commit is not merged into any other branches.
Remote history in Bitbucket looks similar to the following example:
Local history might look similar to the following example, assuming you only need to undo one commit:
To rewind history on the branch containing the bad commit:
Create a temporary branch to keep any local commits.
Reset the original branch to the commit just before the bad commit containing the large files.
Push the new head to Bitbucket (rewriting history).
Restore local changes – you won’t be able to push these yet, you’ll need to remove any large files first.
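On the command line, the steps above might look like the following. This is a sketch that assumes the branch is called main and the bad commit is its tip; adjust branch names and commit counts to your situation:

```shell
git checkout main
git branch temp-keep          # Step 1: keep the bad commit on a temporary branch
git reset --soft HEAD~1       # Step 2: rewind main to just before the bad commit
git push --force origin main  # Step 3: push the new head to Bitbucket (rewrites history)
git reset                     # Step 4: the bad commit's changes are now staged; unstage them
# Step 5: remove the large files from the working tree before committing and pushing again
```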
Using the --soft option will keep any commits you haven't pushed yet and any changes you haven’t committed, so you can push them later once you’ve removed any large files. If you don’t have changes you want to keep, you can use the --hard option and skip Steps 1, 4, and 5 in the command line example above.
Run garbage collection to delete dangling commits
If you enabled the Labs feature (see tip above), the bad commits will be deleted by automatic garbage collection and your repo size should be reduced in a few minutes.
Otherwise you'll need to raise a Support request in order to run garbage collection to delete the bad commits that are no longer in the history and reduce the repository’s size.
Remote history in Bitbucket should now look similar to the following example:
Once that’s done, pushes will work again, and you can reduce the repository size to below the 2.0 GB limit by removing large files; see Removing large files below.
Removing large files
Once pushes are unblocked, you can go ahead and remove large files from the repository. Below are some resources to help you maintain your Git repository and provide more information about using Git LFS.
Maintaining a Git repository – how to remove large files from a Git repository.
If you want to keep a lot of large files without paying for extra LFS storage you’ll need to put them elsewhere; see Options for storing large files below for a few of the available options.
Once large files have been removed, it is best practice for everyone using the repository to make a fresh clone; otherwise, someone with an old clone could force push the large files again and you’d be back where you started.
Avoiding large commits
As mentioned above, we recommend repositories be kept under 2.0 GB to help ensure that our servers are fast and downloads are quick for our users. Here are a couple of things you can do to avoid the following error when trying to push a large commit: remote: fatal: pack exceeds maximum allowed size.
If you already tried and failed to push, reset the last commit and try again:
git reset --mixed COMMIT-SHA (this removes the large files from the repository index and leaves them unstaged but still available locally)
git status (shows which changes are currently unstaged)
Stage and commit the changes in smaller batches with git add and git commit.
Push each smaller commit to origin.
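Put together, recovering from a failed push of one oversized commit might look like this. COMMIT-SHA stands for the commit just before the large one, and the directory names are hypothetical:

```shell
git reset --mixed COMMIT-SHA         # drop the large commit; its files become unstaged
git status                           # review which changes are now unstaged
git add assets/part1/                # stage a first, smaller batch (hypothetical path)
git commit -m "Add first batch of assets"
git push origin main
git add assets/part2/                # then the next batch, and so on
git commit -m "Add second batch of assets"
git push origin main
```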
Set up and use Git LFS. Learn more about using Git LFS with Bitbucket Cloud.
Avoid adding large files
There are a couple of things you can do to avoid accidentally adding large files to your repository:
Tell Git to ignore the kinds of files you don’t want to include.
Install automation to prevent large commits from being created.
Ignore large files
You can tell Git to exclude files from commits by adding pathname patterns to a file called .gitignore in the directory containing your local repository.
In general, you may want to tell Git to ignore:
build artifacts – best to put all these in a directory, for example Maven puts them in a target directory.
IDE settings – you don’t usually want these in the repository, for example ignore the .idea directory.
dependencies – exclude caches of dependencies, for example Python’s virtualenv or node’s local packages.
media files – a git repository is not the best place to keep large audio or video files.
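A .gitignore covering the categories above might look like this; the specific paths are examples, so adjust them to your project:

```text
# Build artifacts
target/
dist/
*.exe

# IDE settings
.idea/

# Dependency caches
venv/
node_modules/

# Large media files
*.mp4
*.mov
```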
Block large commits
To prevent any large files from being included in commits, you can install a local hook that checks the size of files in every commit and rejects the commit if a file is too large.
Copy the check_added_large_files script into the repository, then ask everyone using the repository to link to that script from their local clone as a pre-commit hook. Because the script is kept in the repository, you can modify it to work however you want, and everyone will get the latest logic when they pull.
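For example, if the script is kept at scripts/check_added_large_files in the repository (a hypothetical location), each person can link it into their clone like this:

```shell
# Make the repository copy of the hook executable
chmod +x scripts/check_added_large_files
# Link it into the local clone's hooks directory; the relative path is
# resolved from inside .git/hooks/, hence the two leading "../"
ln -s ../../scripts/check_added_large_files .git/hooks/pre-commit
```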
Options for storing large files
Your Bitbucket repository is the best place to keep source files. There are better places to keep other files generated from that source. Here we give some examples but encourage you to explore all the available options.
Use an artifact repository
There are many services that store build artifacts.
Configure your build process to upload build artifacts to these repositories so they can be shared. Also, make sure .gitignore is configured to exclude build artifacts from commits.
Use a Docker repository
If your Bitbucket repository is used to build executables, consider building them as Docker images.
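For example, a build step might publish an image to a registry instead of committing the compiled output to the repository; the registry, team, and image names here are hypothetical:

```shell
# Build an image from the Dockerfile in the current directory and publish it
docker build -t registry.example.com/myteam/myapp:1.0 .
docker push registry.example.com/myteam/myapp:1.0
```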
Use AWS S3
Rather than storing large media files in your Bitbucket repository, upload them to an S3 bucket where they can be easily downloaded.
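With the AWS CLI installed and credentials configured, an upload might look like this; the bucket and key names are hypothetical:

```shell
# Upload a media file to S3 instead of committing it
aws s3 cp media/video.mp4 s3://my-media-bucket/videos/video.mp4
# Generate a time-limited download link (here, valid for one hour)
aws s3 presign s3://my-media-bucket/videos/video.mp4 --expires-in 3600
```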
Use Git LFS
If the files really need to be part of the Bitbucket repository, use the Large File Storage available on your plan. You can buy more storage if required.
Tell Git to use LFS for specific types of file using a wildcard pattern:
For example, to use LFS for MP4 movies:
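Assuming the Git LFS client is installed, the commands might look like this:

```shell
git lfs install           # one-time setup for the repository
git lfs track "*.mp4"     # record the pattern in .gitattributes
git add .gitattributes    # commit this so everyone stores MP4s in LFS
```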
The quotes in the example above are important.
Use BFG to migrate a repo to Git LFS by moving existing large files into more efficient Large File Storage, reducing your repository size and giving a better Bitbucket experience.