Generate support logs in Bitbucket Cloud Pipelines
Platform Notice: Cloud Only - This article only applies to Atlassian products on the cloud platform.
Summary
This KB covers generating support logs in Bitbucket Pipelines using artifacts to troubleshoot issues and monitor processes effectively.
Using artifacts
If you encounter an issue when using Pipelines, the logs can give you useful information about what might be happening. Rather than trying to read through all of the build logs, you can use the power of artifacts to create specific log files, which you can then download.
For example, you can get a snapshot of the processes running in your pipeline by adding the lines below to your bitbucket-pipelines.yml file. This is useful if you're trying to see whether a process has hung or is using too much memory.
bitbucket-pipelines.yml
pipelines:
  default:
    - step:
        artifacts:
          - process-logs.txt # declaring that you want to keep this file as an artifact
        script:
          - while true; do date && ps aux && echo "" && sleep 30; done >> process-logs.txt &
          - # The rest of the script.
This will log the process information to the process-logs.txt file instead of the build logs.
Once the pipeline step completes, you can download the artifact from the Artifacts tab on the Result Page.
Please remember that artifacts have a size limit of 1 GB and are stored for 14 days after executing the step that produced them. After this time, artifacts are no longer available.
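If a long-running log risks approaching that limit, one option is to compress the file at the end of the step and keep only the compressed copy as the artifact. The snippet below is a minimal sketch based on the first example; the gzip command and the process-logs.txt.gz file name are assumptions, and gzip must be available in your build image.
bitbucket-pipelines.yml
pipelines:
  default:
    - step:
        artifacts:
          - process-logs.txt.gz # keep only the compressed log as the artifact
        script:
          - while true; do date && ps aux && echo "" && sleep 30; done >> process-logs.txt &
          - # The rest of the script.
          - gzip -c process-logs.txt > process-logs.txt.gz # compress whatever has been logged so far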
Solution
Other examples
Container memory monitoring
This is useful if you are getting out-of-memory errors, or if you want to check how much memory your pipeline is using:
bitbucket-pipelines.yml
pipelines:
  default:
    - step:
        artifacts:
          - memory-logs.txt
        script:
          - while true; do echo "Memory usage in megabytes:" && echo $((`cat /sys/fs/cgroup/memory.current | awk '{print $1}'`/1048576)) && echo "" && sleep 30; done >> memory-logs.txt &
          - # The rest of the script.
Docker events
If you’d like to collect more information about your Docker events, you can log the output of the docker events command:
bitbucket-pipelines.yml
pipelines:
  default:
    - step:
        services:
          - docker
        artifacts:
          - docker-event-logs.txt
        script:
          - docker events >> docker-event-logs.txt &
          - # The rest of the script.
Combined logs
You can set up several logs at the same time. This example extracts all three of the logs shown above.
bitbucket-pipelines.yml
pipelines:
  default:
    - step:
        services:
          - docker
        artifacts:
          - process-logs.txt
          - memory-logs.txt
          - docker-event-logs.txt
        script:
          - while true; do date && ps aux && echo "" && sleep 30; done >> process-logs.txt &
          - while true; do echo "Memory usage in megabytes:" && echo $((`cat /sys/fs/cgroup/memory.current | awk '{print $1}'`/1048576)) && echo "" && sleep 30; done >> memory-logs.txt &
          - docker events >> docker-event-logs.txt &
          - # The rest of the script.
Build your own
These examples are just the beginning. You can combine a variety of system commands to get detailed logs for your situation. We recommend keeping each log file below 1 GB, both for ease of viewing and to avoid the artifact size limit.
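As one illustration, the sketch below follows the same pattern as the examples above to log disk usage every 30 seconds with df -h. The disk-logs.txt file name is just an example and not part of the original examples.
bitbucket-pipelines.yml
pipelines:
  default:
    - step:
        artifacts:
          - disk-logs.txt
        script:
          - while true; do date && df -h && echo "" && sleep 30; done >> disk-logs.txt &
          - # The rest of the script.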
Notes
While secured variables are always masked in the UI build logs, don't redirect them to a file if you need to investigate them: redirecting secured variables to a file will reveal their values.
For more information about artifacts, check out our documentation.