Sunday 16 July 2017

SQL Backup To AWS S3


In this blog I am going to help you move your SQL backups to AWS S3.
Below are the steps we will go through in this blog.
  1. Creating an S3 bucket in your AWS account.
  2. Creating IAM accounts in AWS security.
  3. Installing the AWS CLI on your database server.
  4. Moving backups from on-premises to S3 using PowerShell.
  5. Moving backups from AWS EC2 to S3 using PowerShell.

First thing you need is an AWS account.



Creating an S3 bucket in your AWS account
  • Log in to the AWS console.
  • Click on the storage service S3.


  • Click on Create bucket.
  • Fill in the bucket name and the region where you want to place this storage, then click Next.

  • Set the properties of the bucket. Enable the versioning property and click Next.


  • Give read and write permissions to the user or owner of the bucket and click Next.

  • Review the bucket details and click Create bucket.


  • The new S3 bucket has been created. An equivalent AWS CLI sketch follows below.
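If you prefer to script this step, roughly the same result can be achieved with the AWS CLI. The bucket name matches the one used in the script later in this post, the region is only an example, and regions other than us-east-1 also need a LocationConstraint:

aws s3api create-bucket --bucket aws-s3-sql-backup --region us-east-1
aws s3api put-bucket-versioning --bucket aws-s3-sql-backup --versioning-configuration Status=Enabled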

   

     Creating IAM accounts in AWS security
  • Click on the IAM service under Security.
  • Select Users in the left pane of the IAM dashboard.

  • Click Add user.

  • Provide a username, enable programmatic access and click Next.

  • Add the user to the Admin group and click Next.

  • Review the user details and click Create user.
  • This will create the user and provide an Access Key ID and Secret Access Key. Make sure you make a note of them for future use with AWS CLI access. A rough AWS CLI equivalent is sketched after this list.
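If you already have administrative credentials configured elsewhere, roughly the same user setup can be done from the AWS CLI. The user name below is a placeholder, and the group name assumes an admin group already exists in your account:

aws iam create-user --user-name sql-backup-user
aws iam add-user-to-group --user-name sql-backup-user --group-name Admins
aws iam create-access-key --user-name sql-backup-user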


    Installing AWS CLI in your database server

       Download and install the AWS CLI on the database server where the backup files are located.
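       Once the installation finishes, you can confirm the CLI is available from a command prompt; the exact version string will differ on your machine:

aws --version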



    Moving backups from on-premises to S3 using PowerShell


   Step 1:
   Configure the AWS CLI on the database backup server.
   Open a command prompt and configure the AWS CLI as below.


   Provide the Access Key ID and Secret Access Key which you saved while creating the IAM user.
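   For reference, a typical aws configure session looks like the following; the key values and region shown here are placeholders, not real credentials:

aws configure
AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
AWS Secret Access Key [None]: ****************************************
Default region name [None]: us-east-1
Default output format [None]: json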

  Step 2:
  Save the below PowerShell script at C:\Scripts\S3_copy.ps1

  Provide the bucket name and backup location in the script.

Import-Module -Name AWSPowerShell

$BucketName = "aws-s3-sql-backup"
$s3Directory = "C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Backup"
$concurrentLimit = 50
$inProgressFiles = @()

foreach ($i in Get-ChildItem $s3Directory)
{
  # Upload the file to S3 only if an object with the same key is not already in the bucket,
  # then add the filename to a collection.
  if (!(Get-S3Object -BucketName $BucketName -Key $i.Name)) {
    Write-S3Object -BucketName $BucketName -Key $i.Name -File $i.FullName
  }
  $inProgressFiles += $i.Name

  # Wait to continue iterating through files if there are too many concurrent uploads.
  while ($inProgressFiles.Count -gt $concurrentLimit)
  {
    Write-Host "Before: "$($inProgressFiles.Count)

    # Reassign the array by excluding files that have completed the upload to S3.
    $inProgressFiles = @($inProgressFiles | ? { @(Get-S3Object -BucketName $BucketName -Key $_).Count -eq 0 })

    Write-Host "After: "$($inProgressFiles.Count)

    Start-Sleep -s 1
  }

  Start-Sleep -s 1
}

  
  Step 3:
  • Open SSMS and create a backup job with the below two steps.
  • Job step 1 - Takes backups of the user DBs.
  • Job step 2 - Copies recent backups from the database server backup location to the AWS S3 bucket.



   Properties of Step 2:

   This will kick off the PowerShell script placed on the C drive.
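   If you configure job step 2 as an Operating system (CmdExec) step, the command could look something like this; the path matches the script location used in Step 2, so adjust it if you saved the script elsewhere:

powershell.exe -File "C:\Scripts\S3_copy.ps1"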

     Please check your S3 bucket once the job completes.
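     You can also list the uploaded files from the command line, assuming the bucket name used in the script:

aws s3 ls s3://aws-s3-sql-backup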

   Note: Before running the PowerShell script, make sure the execution policy on your database server is set to RemoteSigned, as shown below.
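   A minimal example, run from an elevated PowerShell session (RemoteSigned is the policy referred to in the note; pick the scope that matches your environment):

Set-ExecutionPolicy RemoteSigned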

     
     Moving backups from AWS EC2 to S3

  You can use the same script on an AWS EC2 SQL Server machine to copy the backup files to S3.