If you run a self-hosted Atlassian suite (Bitbucket, Confluence or Jira), you might want to back up the data to an external repository in case your server crashes.

Requirements:

  1. An AWS S3 bucket
  2. An AWS programmatic account with access to the S3 bucket
  3. The AWS CLI installed on the Linux server and configured with the programmatic account (see the example after this list)
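
For requirement 3, configuring the CLI with the programmatic account's keys looks something like this (the apt package name is an assumption; use your distro's package manager or pip as appropriate):

#Install and configure the AWS CLI (Debian/Ubuntu shown; adjust for your distro)
sudo apt-get install -y awscli
aws configure    #prompts for the access key, secret key, default region and output format

With the CLI configured, the backup script looks like this: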
#!/bin/bash

host=atlassian
bucket=yourbucketnamegoeshere
logs=/var/logs/backuplogs
tmp=/tmp/backup.cache
date=$(date +%d-%m-%Y)
year=$(date +%Y)
month=$(date +%B)
day=$(date +%d)

#Log the exit code of the previous command; failures are written to the log file
function funcerrlogs
{
  RESULT=$?
  if [ $RESULT -ne 0 ]; then
    echo "$(date)" >> "$logs/logs-$date.txt"
    echo "failed with exit code $RESULT" >> "$logs/logs-$date.txt"
  else
    echo success
  fi
}

#Make folders if they don't exist
mkdir -p "$tmp"
mkdir -p "$logs"

#Backup Bitbucket Repositories
echo Backing up Bitbucket Repositories

#Tar up the repository data (-P keeps absolute paths so the archive restores to the same place)
tar -cvzPf "$tmp/atlassian-repos-$date.tar.gz" /var/atlassian/application-data/bitbucket/shared/data/repositories

#Log any error from the tar step
funcerrlogs

#Transfer tar to AWS S3
aws s3 cp "$tmp/atlassian-repos-$date.tar.gz" "s3://$bucket/$year/$month/$day/$host/atlassian-repos-$date.tar.gz"

#Log any error from the upload
funcerrlogs

#Remove the local tar once uploaded
rm -f "$tmp/atlassian-repos-$date.tar.gz"
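
Once a run completes, you can confirm the object landed in the bucket (the bucket name here is the placeholder from the script):

aws s3 ls s3://yourbucketnamegoeshere/ --recursive | grep atlassian-repos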

You will need to change the host and bucket variables at the top of the script to match your environment.

This will back up the Bitbucket repository data; if the server crashes you can build a new server and restore the data to the same location. The same pattern extends to the Confluence and Jira home directories.
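
A restore is simply the reverse; a minimal sketch, assuming a backup taken on 01-01-2025 (adjust the key to the object you need):

aws s3 cp s3://yourbucketnamegoeshere/2025/January/01/atlassian/atlassian-repos-01-01-2025.tar.gz /tmp/
#-P restores the absolute paths stored in the archive, putting files back in place
tar -xvzPf /tmp/atlassian-repos-01-01-2025.tar.gz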

It stores the tarball under a structured year/month/day prefix in the bucket. To get the most out of this, I would recommend creating a cron job so it runs automatically on a schedule that suits you, as shown below.
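
For example, to run the backup every night at 02:00 (the script path is an assumption; use wherever you saved it):

#Add via crontab -e
0 2 * * * /opt/scripts/atlassian-backup.sh >> /var/logs/backuplogs/cron.log 2>&1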
