"error: index-pack died of signal 25" while pushing to Bitbucket server

Platform Notice: Data Center Only - This article only applies to Atlassian products on the Data Center platform.

Note that this KB was created for the Data Center version of the product. Data Center KBs for non-Data-Center-specific features may also work for Server versions of the product; however, they have not been tested. Support for Server* products ended on February 15, 2024. If you are running a Server product, you can visit the Atlassian Server end of support announcement to review your migration options.

*Except Fisheye and Crucible

Summary

Unable to import a very large GitHub repository into Bitbucket Server via the terminal

Environment

Linux

Diagnosis

  • The repository may be very large (for example, 40 GB) because of large binary files, or the temporary pack file created during the push may be large.

  • While pushing to Bitbucket, you're getting a failed to push some refs to <SSH URL> error.

  • The clone from GitHub is successful using the steps here: Import code using the terminal, but the push to Bitbucket fails.

  • Try pushing via both the HTTPS and SSH URLs to see whether the same errors occur.

  • Turn on Git debug logging (a sketch for enabling this appears after this list); you will see errors like the following:

error: index-pack died of signal 25
error: remote unpack failed: index-pack abnormal exit
error: failed to push some refs to 'ssh://git@tvsbitbkt.tvsmotor.com:7999/aeg-aosp/dummy.git'
  • Bitbucket logs will have errors like the following:

DEBUG [http-scmrequest-handler:thread-10] bbadmin @776WURx814x734207x0 10.121.4.67 "POST /scm/<project-key>/<repo-name>.git/git-receive-pack HTTP/1.1" c.a.s.i.s.g.p.h.GitSmartExitHandler <PROJECT-KEY>/<repo-name>[<repo-id>]: Write request from 10.121.4.67 succeeded, but the following was written to stderr: error: index-pack died of signal 25
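
To capture more client-side detail when reproducing the failure, Git's built-in trace variables can be enabled for a single push. This is a minimal sketch; the remote name (origin), branch (master), and log file name are placeholders for your own values:

# Enable Git client-side tracing for one push and capture stderr to a file
GIT_TRACE=1 GIT_TRACE_PACKET=1 GIT_CURL_VERBOSE=1 \
  git push origin master 2> git-push-trace.log

# Review the captured trace afterwards
less git-push-trace.log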

Good to know

When you do a push, does Git create a temporary file somewhere, and is that file then unpacked?

Pack data is streamed to a file on disk because it cannot safely be held in memory: the size of the pack data is unknown until it has been fully received. Git writes it to a file in the repository's objects/pack directory, with a name starting with tmp_pack. In parallel, Git indexes that pack data and writes the index to another temporary file. When the push completes successfully, the pack is renamed to its final name (or, if it is very small, it is unpacked into loose objects). By default, Git does not limit the size of individual packs. For example, the Jira Cloud repository is currently 17 GB on disk, and 14 GB of that is in a single pack file. It is possible to configure Git to limit pack size, but that is not the default, it is not recommended (by the Git developers or by Atlassian), and it is not a supported configuration change.
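
To see this behaviour on the Bitbucket server itself, you can watch the repository's objects/pack directory while a large push is in flight. This is an illustrative sketch only; the repository path below is a placeholder and depends on your Bitbucket home directory and the repository's internal ID:

# Placeholder path: <BITBUCKET_HOME>/shared/data/repositories/<repo-id>
REPO=/var/atlassian/application-data/bitbucket/shared/data/repositories/1234

# Temporary pack files (names starting with tmp_) appear here while the push is being received
watch -n 5 "ls -lh $REPO/objects/pack/ | grep tmp_"

# Total size of the pack directory, to gauge how large the incoming pack is
du -sh $REPO/objects/pack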

Cause

The issue is caused by the file size limit (ulimit -f) on the Linux server where Bitbucket is hosted. On Linux, signal 25 is SIGXFSZ, which is delivered when a process tries to write a file larger than its file size limit, so git index-pack is killed while writing the incoming pack file.

Solution

Check the ulimit values in your environment using the command below:

$ ulimit -a
-t: cpu time (seconds)              unlimited
-f: file size (blocks)              unlimited
-d: data seg size (kbytes)          unlimited
-s: stack size (kbytes)             8192
-c: core file size (blocks)         0
-v: address space (kbytes)          unlimited
-l: locked-in-memory size (kbytes)  unlimited
-u: processes                       2784
-n: file descriptors                2560

Make sure that the file size value on your system is set to unlimited:

-f: file size (blocks) unlimited

To do that, switch to the root user using the sudo su - command and run:

ulimit -f unlimited

Source: https://ostoday.org/linux/how-do-i-change-the-ulimit-parameters-in-linux.html 
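
A shell-level ulimit change applies only to the current session and its child processes, so it is worth confirming that the limit seen by the running Bitbucket process is actually unlimited. The sketch below assumes Bitbucket runs as the atlbitbucket user and that its process command line contains "bitbucket"; adjust both to match your installation:

# Find the Bitbucket JVM process ID (user name and match pattern are assumptions)
BB_PID=$(pgrep -u atlbitbucket -f bitbucket | head -n 1)

# Both the soft and hard "Max file size" limits should report "unlimited"
grep "Max file size" "/proc/${BB_PID}/limits"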

Updated on February 28, 2025
