If you have a self-hosted Atlassian suite (Bitbucket, Confluence, or Jira), you might want to back up the data to an external repository such as AWS S3 in case your server fails.


This script creates a tar archive of all your Atlassian data and uploads it to AWS S3 using the AWS CLI.


You will need the following:

  1. 1 x AWS S3 bucket
  2. An AWS programmatic account with write access to the S3 bucket
  3. The AWS CLI installed on the Linux server and configured with the programmatic account's credentials
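For reference, once configured (for example via `aws configure`), the CLI stores the programmatic account's keys in `~/.aws/credentials`. The key values below are placeholders:

```ini
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```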



#!/bin/bash

# Set these to match your environment (placeholder values)
tmp=/tmp
bucket=my-backup-bucket
host=$(hostname -s)

date=$(date +%d-%m-%Y)
year=$(date +%Y)
month=$(date +%B)
day=$(date +%d)

#Backup Bitbucket Repositories
echo "Backing up Bitbucket repositories"

#Tar up data
tar cvzfP "$tmp/atlassian-repos-$date.tar.gz" /var/atlassian/application-data/bitbucket/shared/data/repositories

#Transfer tar to AWS S3
aws s3 cp "$tmp/atlassian-repos-$date.tar.gz" "s3://$bucket/$year/$month/$day/$host/atlassian-repos-$date.tar.gz"

#Remove tar locally (only the archive, not the whole directory)
rm -f "$tmp/atlassian-repos-$date.tar.gz"
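The script relies on tar's `P` flag, which preserves the leading `/` in member names so the archive extracts back to its original absolute path. A quick throwaway demo of that round-trip (the paths here are examples, not the script's real data):

```shell
# Create some throwaway data at an absolute path
mkdir -p /tmp/demo-data
echo "hello" > /tmp/demo-data/file.txt

# Archive with P to preserve the leading slash, as the backup script does
tar czfP /tmp/demo.tar.gz /tmp/demo-data

# Delete the original, then restore from the archive
rm -rf /tmp/demo-data
tar xzfP /tmp/demo.tar.gz

cat /tmp/demo-data/file.txt   # prints: hello
```

This is also how a restore from S3 works: `aws s3 cp` the archive back down, then extract it with the same `P` flag.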


You will need to set the tmp, bucket, and host variables at the top of the script to match your environment.

By default, Atlassian data is located under “/var/atlassian/application-data”; if your installation uses a different location, change the path in the tar command accordingly.

It creates the following folder structure in the S3 bucket: /year/month/day/host
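For example, with a fixed date and a hypothetical hostname, the prefix works out like this:

```shell
# Build the same prefix the script uses, pinned to an example date (GNU date)
year=$(date -d "2024-03-05" +%Y)
month=$(date -d "2024-03-05" +%B)
day=$(date -d "2024-03-05" +%d)
host=atlassian01   # hypothetical hostname

# In an English locale this prints: 2024/March/05/atlassian01
echo "$year/$month/$day/$host"
```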

Add the script to a cron job to run the backup automatically.
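For example, a crontab entry like the following runs the backup every night at 2am (the script path and schedule are assumptions; adjust to suit):

```
0 2 * * * /opt/scripts/atlassian-backup.sh >> /var/log/atlassian-backup.log 2>&1
```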


With the script running on a cron schedule, your data lives in at least two places and, depending on how often it runs, you will have plenty of points to restore from.
