Random Ramblings About Making Games and Stuff from Cloud


Azure SQL backup using Red-Gate Cloud Services

I stumbled upon the Red Gate Azure Backup service: https://cloudservices.red-gate.com/

After using the beta for a couple of weeks I am really impressed. Everything is easy, the UI is intuitive, and you don’t need much configuration to make it work. You can set up a basic daily backup job in a matter of minutes, provided you have an Azure Storage or Amazon S3 account ready. If not, it still doesn’t take long. You spend most of the time digging up usernames and passwords, not figuring out how things work! Splendid.

So let’s go through how to set up a daily Azure SQL backup to Amazon S3 storage.

The Red-Gate Cloud Services Dashboard is really clean and simple. Just select the action you want to perform.

First, after registering for Red-Gate Cloud Services and logging in, select “Backup SQL Azure to Amazon S3” from the Dashboard.

Just fill in the Azure SQL credentials and Amazon S3 keys.

Second, fill in the Azure SQL server name (xxxxxxx.database.windows.net) and the login credentials for that database, then press the refresh button next to the Database drop-down. Select the database that you want to back up. Next, click the “AWS Security Credentials” link and log in to your Amazon AWS account. You will be taken directly to the place where you can find the keys.

"Access key id" goes to "AWS Access Key" and "Secret Access Key" is shown after pressing "show" link in Amazon AWS.

Note that you need an S3 bucket. If you don’t have one, you can create it here: https://console.aws.amazon.com/s3/home. I will not explain how that is done, but it is really easy. After you have entered the Amazon keys, press the refresh button next to the Bucket drop-down in the Red-Gate UI. Then fill in the name you want your backup file to have. Don’t worry about a date stamp, because the tool adds the date to the end of the filename. Press the “Continue” button to select scheduling options, or back up right away.

Just select when you want your daily backup to run, or press the "Backup now" button.

Next, fill in the exact time you want your backup to run in 24h format, select the time zone, select the weekdays on which to run it, and press the “Schedule” button. You can also just press the “Backup now” button for a one-time backup. Note that there is also a monthly backup option.

You should see your scheduled backup on the next screen. When your first backup is done, it appears here under the History header.

Next you are taken to the schedules and history view. Here you can view details of the upcoming backup operation, cancel it, or invoke a backup right away. Just move your mouse over the upcoming event.

Just move the mouse over a scheduled backup job and you can cancel it or invoke it to run right now.

When a backup job has completed successfully, you will receive an email and a “log” line will appear in “schedules and backup history”.

Whether the job succeeded or not, it will appear here under the History title.

You can view the details of executed jobs by moving the mouse over a history line. If your backup job ran into errors, you will see an error triangle and receive the error message via email. The email also contains a direct link to that specific history log line. Neat!

If your backup job had exceptions, it gets a warning icon in the history. The Details link will contain an error message describing what went wrong.

Clicking the link in the email takes you to the same view as clicking “details” next to a completed job.

Here you can read through what happened while the backup was executing. If you ran into errors, they are shown here as well. Just scroll down.

If your backup ran cleanly, you should see the created bacpac file in the Amazon S3 bucket.

The backup file, safe and sound in Amazon S3.

Note that you can just as easily use Azure Storage or FTP to upload the backup to your own backup infrastructure.

There are a couple of missing features that I would like to see: for example, an option similar to the Red-Gate backup tool’s to first create a copy of the database (for transactional safety), and the ability to encrypt the backup file before it is uploaded to Amazon or Azure storage. But even without these small features this is still awesome!

Carefree clouding!

SQL Azure monitoring with Cotega

I just started using this SQL Azure and SQL Server monitoring tool, http://www.cotega.com/, and I must say it is exactly what I was looking for in an easy-to-use monitoring tool. I don’t need a complex, feature-rich, massive monitoring solution. I only need an alert when the database is not accessible, when performance seems to be below average, or when there is a spike in usage. The service is still in beta but it looks really promising.

Just click "add SQL Azure database"

Setting up the database connection was easy. After login you are presented with the Dashboard view. Just click “Add SQL Azure database” and fill in the database address, username, and password in the popup. Finally, press the “Add database” button.

Fill in the address and login details.

You can add notifications to your databases by going to the Notifications view and pressing the “Create Notification” button.

To add new notification just press "Create Notification"

Just fill in the name of the notification, then select a monitoring rule from “select what you want to monitor”, the database, and the polling frequency.

Fill in the details and select monitoring target.

After you have selected the monitoring target, you can select when a notification should be created. In this case I have selected “when connection fails”. After this you can fill in an email address and even select a stored procedure to be run, although in the connection-fails case that does not make much sense :). Finally, press “Add Notification”.

Add an email address and select a stored procedure if you need one.

There you go. Just wait for those notifications to start kicking in.
There is a nice feature in the notifications: when you start to receive lots of them, for example concerning “connection fails”, you can temporarily disable the notification directly from the notification email.
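
If you want to check by hand what a “when connection fails” rule is essentially testing, a minimal PowerShell probe could look like the sketch below. The server name, database, and credentials are placeholders for your own values.

# Placeholder connection string -- replace server, database, and credentials with your own.
$connectionString = "Server=tcp:myserver.database.windows.net;Database=mydb;User ID=myuser@myserver;Password=secret;Encrypt=True"
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
try {
    $connection.Open()   # throws if the database is unreachable
    Write-Host "Connection OK"
} catch {
    Write-Host ("Connection failed: " + $_.Exception.Message)
} finally {
    $connection.Dispose()
}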

When you get many notifications of the same type, you have the option to disable the notification directly from the email.

You can check the logs of a specific notification to make sure the rule works.

You can see how rules are being evaluated.

You can also view this info in a report view for performance analysis purposes.

It’s easier to spot performance problems from visualized reports.

A very promising, neat little monitoring tool for those who don’t need a complex solution to a simple problem!

Azure SQL Backup and restore scenarios using bacpac export/import

What Azure SQL was missing was a properly supported way to make backups for disaster scenarios. Those scenarios include losing control of your Azure SQL server, or human error causing a SQL admin to delete the whole Azure SQL server. Great news, everyone: Azure SQL now has tools to mitigate the impact of these scenarios. The SQL Azure Import/Export Service CTP is now available. More details on how this works can be found here.

What do you need?

You need a means of scheduling the backup, an Azure Storage account to put the bacpac files in, a means of getting the exact URL of a bacpac file, and an Azure SQL account to back up.

Azure Account

You need one Azure account with an Azure SQL server and a storage account, of course. 🙂

You might want to have a separate backup storage account, because if you cannot access your production account, you still have access to your backups. I personally download bacpac backups from Azure to a local server.

Bacpac file export tool

In this example I will use the Red-Gate backup tool, because it allows you to easily make a copy of your database before the backup. This lets you make transactionally safe backups.

But you can also use the DAC SQL Azure Import Export Service Client V 1.2. With that tool you need to make the database copy yourself, using for example the Cerebrata cmdlets.
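
For reference, the database copy itself is plain T-SQL run against the master database of your Azure SQL server. Below is a minimal PowerShell sketch; the server name, credentials, and database names are placeholders.

# Connect to the master database of the Azure SQL server (placeholder values).
$connectionString = "Server=tcp:myserver.database.windows.net;Database=master;User ID=myuser@myserver;Password=secret;Encrypt=True"
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$connection.Open()
$command = $connection.CreateCommand()
# CREATE DATABASE ... AS COPY OF takes a transactionally consistent copy in SQL Azure.
$command.CommandText = "CREATE DATABASE mydb_backupcopy AS COPY OF mydb"
[void]$command.ExecuteNonQuery()
$connection.Dispose()
# The copy runs asynchronously; poll sys.dm_database_copies in master to see when it completes.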

Azure Storage account browser

You can use any tool you like. In this example I will use Azure Storage Explorer because it is free 🙂

Windows server

Ideally this would be a dedicated Windows 2008 server, so that you can be sure it runs smoothly. You can also download the exported bacpac files to this server. Just to be safe 🙂

Backup using Red-Gate command line backup tool

Here is a nice how-to video on setting up scripts with the Red-Gate tool. NOTE that in this video the script makes a backup into a different database. What we want to do is schedule bacpac file generation. If you want to test how bacpac file generation works without the command line, watch this video.

But back to business: your scheduled script should look something like this:

RedGate.SQLAzureBackupCommandLine.exe /AzureServer:[url_to_azure_server] /AzureDatabase:[databasename] /AzureUserName:[db_owner_username] /AzurePassword:[password] /CreateCopy /StorageAccount:[Azure_account_name] /AccessKey:[primary_or_secondary_azure_storage_key] /Container:[container_name_in_storage] /Filename:[filename_of_bacpac]

Notice that I did not use any real values in the script above. Just fill in the parameters between [ ] and schedule the script to run as often as you like. Note that you need to change the file name on every run, because overwriting with the same file name does not work. I use date+time combinations.
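
To get a unique file name on every run, you can wrap the call in a small PowerShell script and schedule that instead. A minimal sketch follows; the tool path, server, credentials, and storage details are placeholders for your own values.

# backup-azuresql.ps1 -- builds a timestamped bacpac name and calls the Red-Gate tool.
$timestamp = Get-Date -Format "yyyy-MM-dd_HHmm"
$filename  = "mydatabase_$timestamp.bacpac"

& "C:\Tools\RedGate.SQLAzureBackupCommandLine.exe" `
    /AzureServer:myserver.database.windows.net `
    /AzureDatabase:mydatabase `
    /AzureUserName:db_owner `
    /AzurePassword:secret `
    /CreateCopy `
    /StorageAccount:mystorageaccount `
    /AccessKey:PRIMARY_OR_SECONDARY_KEY `
    /Container:backups `
    /Filename:$filename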

So now we are ready for disasters 😉 Next I will explain how to perform a restore to a totally new Azure SQL server.

Restore using Windows Azure management portal

When you need to restore a database to a new server, you need access to the bacpac backup file in an Azure storage account.

  • First, create a new Azure SQL server. A how-to video for that is here.
  • Second, and this is important, add the same database logins that the backed-up database had. Azure SQL user management is explained in detail here.
  • Third, get the URL of the bacpac file using Azure Storage Explorer. Open Azure Storage Explorer, log in to the storage account containing the bacpac file, select the bacpac file, and press the View button. This is also shown in the video in the next step.
  • Fourth, you are ready to restore! It can be done as explained in this video.

If you have any comments or questions please don’t hesitate to ask.

Thanks for reading and happy backupping!

I have my app in Azure. Now what?

So you have your application running on Azure. Some next steps would be beneficial. Well, maybe, but first you need to figure out answers to a couple of important things:

  • Backup
  • Monitoring
  • Autoscaling

As of now, Azure does not provide any off-the-shelf solutions for these. You need to build them yourself. I will describe one possible setup that provides some functionality for the points listed above. Being the Cloud Man that I am, this setup also runs entirely in the Cloud 🙂

Setup

Firstly, the setup. In order to get things up and running you will need the following tools and accounts:

  • Amazon EC2 account with Windows 2008 and Amazon monitoring service.
  • Windows 2008 server (with SQL Server 2008 R2 client utilities) for Azure backup.
  • SQL Server 2008 R2 for Azure SQL backup.
  • Azure Management Cmdlets tools for Azure Storage backup.
  • Red-Gate SQL Comparison Bundle version 9 or above for Azure SQL backup.
  • AzureWatch account for Autoscaling and monitoring Azure web and worker roles.
  • (Optionally) Dropbox with Packrat service so you will get unlimited undo history on your backups.

The overall picture of this setup will look something like the picture below. You could replace the Amazon EC2 Windows 2008 server + SQL Server 2008 R2 with an Azure VM role or your own hosted server, and Dropbox with an Azure storage account. By replacing Amazon with an Azure VM role you will save on data transfer fees, and the steps should be fairly similar. If you decide to do that, I would recommend keeping those under a different Azure subscription, so that one administrator cannot delete both your backups and your service!

Azure setup

A high-level picture of the monitored Azure app with backup and autoscaling.

In addition, make sure that the Windows 2008 server has at least SQL Server 2008 R2 installed, and especially bcp.exe version 10.50.1600.1. That is because the bcp.exe utility is used to perform the database backups, and older versions had a nasty bug that prevented the backup from working.
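
You can quickly check which bcp version is on the path with the utility’s version switch:

bcp -v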

If you have several services running under one Azure subscription, it is useful to direct their logs into one shared storage account, because AzureWatch can monitor only one log account per subscription.

Installation instructions

I won’t go into all the details, because the steps are quite obvious. If you get confused by my quick instructions, please send me an email and I will add more details to this blog post.

  1. Get an Amazon account and launch a Windows 2008 server with SQL Server 2008 R2, or create a VM role on Azure.
  2. Install SQL Server 2008 R2 client utilities
  3. Install Dropbox
  4. Install Azure Management Cmdlets
  5. Install Red-Gate SQL Comparison Bundle
  6. Get an AzureWatch account and install the control panel on the 2008 server.

Scripts

You should also automate the running of these PowerShell scripts with the Windows 2008 Server Task Scheduler:

  • Remember that you need to give the full path to the PowerShell script in the task arguments, like this (see the schtasks example after this list for creating the whole task from the command line):

-noninteractive -nologo -command "& 'c:\my backup scripts\backup.ps1'"

  • Modify the SQL backup script from Mike Mooney’s blog to suit your needs. Make sure that the zip files are stored in a folder that is Dropbox-synchronized (see the sketch after this list).
  • Modify the backup sample scripts of the Azure Management Cmdlets to suit your needs. The samples can be found in the installation folder. You can direct the backup to an Azure storage account or to a Dropbox folder.
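
To create the scheduled task itself from the command line, you can use schtasks; the task name, start time, and script path below are just examples:

schtasks /Create /TN "DailyAzureBackup" /SC DAILY /ST 03:00 /TR "powershell.exe -noninteractive -nologo -command \"& 'c:\my backup scripts\backup.ps1'\""

And the Dropbox side needs no special tooling: the end of your backup script just has to copy the file into the synchronized folder. A trivial sketch with placeholder paths:

# Copy the freshly created backup into a Dropbox-synchronized folder (placeholder paths).
$timestamp = Get-Date -Format "yyyy-MM-dd_HHmm"
Copy-Item "C:\backups\mydb.zip" "C:\Users\backup\Dropbox\db-backups\mydb_$timestamp.zip"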

That’s it!

Happy Clouding!