Random Ramblings About Making Games and Stuff from Cloud

Archive for the ‘Azure’ Category

Things I learned building an OData service on Azure and using WP7 as a client

I wanted to build a Windows Phone 7 app called “Stuff I Like”. The app would have an OData service running in Azure and a client that would cache and sync against that service. In the previous post I wrote about what I learned building the WP7 app. In this part I will share my findings on the service side of things. In the next post I will dig into the authentication side of the app.

If you are planning to add federated authentication using the Azure Access Control Service, then you might want to start by building authentication first. For me it was waaaay easier to create a normal ASP.NET web page, add authentication to that, and then add the WCF Data Service once login worked. I would recommend downloading the Azure Training Kit: http://www.microsoft.com/en-us/download/details.aspx?id=8396 and completing the lab exercise http://msdn.microsoft.com/en-us/identitytrainingcourse_acsandwindowsphone7.aspx

Now back to business.

I followed this Data Services in the cloud walkthrough: http://msdn.microsoft.com/en-us/data/gg192994. I decided to design the data model using the Visual Studio designer. After playing around with the tool I managed to “draw” the data model, and after that it was just a matter of syncing it to the database.

This is how I drew the data model

After syncing the above data model into the database, setting up the service and running it, I quickly found that my WCF Data Service was not working at all. The message “The server encountered an error processing the request. See server logs for more details.” was shown to me quite frequently. Well, “the bug” was quite simple to fix, and these debugging instructions helped me a lot http://www.bondigeek.com/blog/2010/12/11/debugging-wcf-data-services:

  1. A missing SetEntitySetAccessRule
  2. A missing pluralisation on the SetEntitySetAccessRule
  3. A missing SetServiceOperationAccessRule

After debugging the entity set access rules with a simple “*” AllRead rule, just to check that I had not made a typo, I quickly found out that I indeed had 😦 The typo was in an EdmRelationshipAttribute, and it was what caused the exception. After I fixed that stupid mistake things started to look better.
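For reference, the access rules live in the service class’s InitializeService method. Mine looked roughly like this while debugging (a sketch only; StuffILikeEntities stands for my Entity Framework container name, so substitute your own):

    using System.Data.Services;
    using System.Data.Services.Common;

    public class WcfDataServiceStuffILike : DataService<StuffILikeEntities>
    {
        // Called once when the service starts; this is where the access rules go.
        public static void InitializeService(DataServiceConfiguration config)
        {
            // "*" opens every entity set for reading. Fine while debugging,
            // but lock this down to named entity sets before going live.
            config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);

            config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
        }
    }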

If you need more detail in the error messages while debugging, you can turn on verbose errors.
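The usual WCF Data Services switches are an IncludeExceptionDetailInFaults attribute on the service class and UseVerboseErrors in InitializeService, roughly like this (a sketch of the standard approach rather than my exact code; remember to turn both off again before production):

    using System.Data.Services;
    using System.ServiceModel;

    // Puts the real exception into the response instead of the generic
    // "See server logs for more details" message.
    [ServiceBehavior(IncludeExceptionDetailInFaults = true)]
    public class WcfDataServiceStuffILike : DataService<StuffILikeEntities>
    {
        public static void InitializeService(DataServiceConfiguration config)
        {
            // Returns detailed error payloads to the client while debugging.
            config.UseVerboseErrors = true;

            config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        }
    }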

This is how my service looked after I finally got it running

After I got the service defined and running, I took a moment to make the OData feed a bit more readable so that it can be consumed with Internet Explorer and other browsers: http://msdn.microsoft.com/en-us/library/ee373839.aspx

This is how the OData feed looks before you tweak it.

For some reason I first managed to add “m:FC_TargetPath” and the related attributes to the wrong XML element. So make sure you scroll down the file and add them in the correct place 🙂
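For reference, the feed customization attributes belong on the <Property> elements of the conceptual model, i.e. the ConceptualModels/Schema section of the .edmx, not in the StorageModels section that appears earlier in the file. Something like this (a sketch; the Stuff entity with Id and Name properties is just an example shaped after my model):

    <!-- In the ConceptualModels / Schema section of the .edmx file. Declare
         xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"
         on the Schema element if it is not already there. -->
    <EntityType Name="Stuff">
      <Key>
        <PropertyRef Name="Id" />
      </Key>
      <Property Name="Id" Type="Guid" Nullable="false" />
      <Property Name="Name" Type="String" Nullable="false"
                m:FC_TargetPath="SyndicationTitle"
                m:FC_ContentKind="text"
                m:FC_KeepInContent="true" />
    </EntityType>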

This is how the OData feed looks after you tweak it a bit

Another thing that took me a couple of hours to figure out was that Internet Explorer does not show all OData results in a consistent way. So before you start heavy debugging, check the source of the returned page and you should see the expected result in XML format, or just use another browser. For example, this call did not seem to return anything until I checked the page source: http://localhost:57510/WcfDataServiceStuffILike.svc/StuffSet(guid'cf1bfd2f-99f3-4047-99f8-22bc1aad1b99')/GategorySet
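If you want to take the browser out of the equation completely, a throwaway console app that dumps the raw Atom XML works too. A minimal sketch (the URL is just the local service address from above, so substitute your own):

    using System;
    using System.IO;
    using System.Net;

    class FeedDump
    {
        static void Main()
        {
            // Ask for the raw Atom feed so no browser feed view can hide entries.
            var request = (HttpWebRequest)WebRequest.Create(
                "http://localhost:57510/WcfDataServiceStuffILike.svc/StuffSet");
            request.Accept = "application/atom+xml";

            using (var response = request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                Console.WriteLine(reader.ReadToEnd());
            }
        }
    }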

So that’s it. Using Visual Studio this was quite easy, and I actually spent more time figuring out why some configuration did not work than writing the code. This might be because I have an “unclean” dev environment, or because I made lots of changes to the demos above while following them. That, in turn, was mainly because I wanted to build my own app and not simply type in demos and labs.

I bet that if you build your dev environment correctly and follow the labs and demos to the letter, you won’t see as many problems as I did. But where is the fun in that 😉


Azure SQL backup using Red-Gate Cloud Services

I stumbled upon the Red Gate Azure backup service: https://cloudservices.red-gate.com/

After using the beta for a couple of weeks I am really impressed. Everything is easy, the UI is intuitive, and you don’t need a lot of configuration to make it work. You can set up a basic daily backup in a matter of minutes if you have an Azure Storage or Amazon S3 account ready, and even if not, it still does not take long. You spend most of the time digging up usernames and passwords, not figuring out how things work! Splendid.

So let’s go through how to do a daily Azure SQL backup into Amazon S3 storage.

Red-Gate Cloud Service Dashboard is really clean and simple. Just select the action you want to perform.

First, after registering for Red-Gate Cloud Services and logging in, select “Backup SQL Azure to Amazon S3” from the Dashboard.

Just fill in the Azure SQL credentials and Amazon S3 keys.

Secondly, fill in the Azure SQL server name (xxxxxxx.database.windows.net) and the login credentials for that database, then press the refresh button next to the Database dropdown and select the database that you want to back up. Next, click the “AWS Security Credentials” link and log in to your Amazon AWS account. You will be taken directly to the place where you can find the keys.

"Access key id" goes to "AWS Access Key" and "Secret Access Key" is shown after pressing "show" link in Amazon AWS.

Note that you need an S3 bucket. If you don’t have one, you can create it here: https://console.aws.amazon.com/s3/home. I will not explain how that is done, but it is really easy. After you have entered the Amazon keys you can press the refresh button next to the Bucket dropdown in the Red-Gate UI. Then fill in the name that you want your backup file to have. Don’t worry about a date stamp, because the tool adds the date to the end of the filename. Press the “Continue” button to select scheduling options or to back up right away.

Just select when you want your daily backup to run, or press the "Backup now" button.

Next, fill in the exact time you want your backup to run (in 24h format), select the time zone, select the weekdays to run it on, and press the Schedule button. You can also just press the Backup Now button for a one-time backup. Note that there is also a monthly backup option.

You should see your scheduled backup on the next screen. When your first backup is done it appears here under the History header.

Next you are taken to the schedules and history view. Here you can view details of the upcoming backup operation, cancel it, or invoke it right away. Just move your mouse over the upcoming event.

Just move the mouse over a scheduled backup job and you can cancel it or invoke it right now.

When a backup job has completed successfully you will receive an email, and a log line will appear under “schedules and backup history”.

Whether the job succeeded or not, it will appear here under the History title

You can view details of executed jobs by moving the mouse over a history line. If your backup job has run into errors you will see a warning triangle as well as receive the error message via email. The email also contains a direct link to that specific history log line. Neat!

If your backup job has exceptions it gets a warning icon in the history. The Details link will contain the error message describing what went wrong.

Clicking the link in the email takes you to the same view as clicking “Details” next to a completed job.

Here you can read through what happened while the backup was executing. If you ran into errors they are shown as well; just scroll down.

If your backup ran cleanly you should see the created bacpac file in the Amazon S3 bucket.

Backup file is safe and sound in Amazon S3

Note that you can just as easily use Azure Storage, or use FTP to upload the backup into your own backup infrastructure.

There are a couple of missing features that I would like to see: for example, the same option as in the Red-Gate backup tool to first create a copy of the database (for transactional safety), and the ability to encrypt the backup file before it is uploaded to Amazon or Azure storage. But even without these small features this is still awesome!

Carefree clouding!

SQL Azure monitoring with Cotega

I just started to use this SQL Azure and SQL Server monitoring tool, http://www.cotega.com/, and I must say that it is exactly what I was looking for in an easy-to-use monitoring tool. I don’t need a complex, feature-rich, massive monitoring tool. I only need an alert when the database is not accessible, when performance seems to be below average, or when there is a spike in usage. The service is still in beta but it looks really promising.

Just click "add SQL Azure database"

Setting up the database connection was easy. After login you are presented with the Dashboard view. Just click “add SQL Azure database” and fill in the database address, username and password in the popup. Finally press the “Add database” button.

Fill in the address and login details.

You can add notifications to your databases by going to the Notifications view and pressing the “Create Notification” button.

To add new notification just press "Create Notification"

Just fill in the name of the notification, select a monitoring rule from “select what you want to monitor”, and pick the database and the polling frequency.

Fill in the details and select monitoring target.

After you have selected the monitoring target you can select when to create a notification. In this case I have selected “when connection fails”. After this you can fill in an email address and even select a stored procedure to be run, although in the connection-fails case the stored procedure does not make much sense :). Finally press “Add Notification”.

Add email address and select stored procedure if you need one

There you go. Just wait for those notifications to kick in.
There is a nice feature in the notifications: when you start to receive lots of them, for example concerning “connection fails”, you can temporarily disable the notification directly from the notification itself.

When you get many notifications of the same type, you have the option to disable the notification directly from the email.

You can check the logs for a specific notification to make sure the rule works.

You can see how rules are being evaluated.

You can also view this info in a report view for performance analysis purposes.

It’s easier to spot performance problems from visualized reports.

A very promising, neat little monitoring tool for those who don’t need a complex solution to a simple problem!

Azure SQL Backup and restore scenarios using bacpac export/import

What Azure SQL was missing was a properly supported way to make backups for disaster scenarios, such as losing control of your Azure SQL server, or human error causing a SQL admin to delete the whole Azure SQL server. Great news, everyone: Azure SQL now has tools to mitigate the impact of these scenarios. The SQL Azure Import/Export Service CTP is now available. More details on how this works can be found here.

What do you need?

You need a means of scheduling the backup, an Azure Storage account to put the bacpac files in, a way to get the exact URL of a bacpac file, and an Azure SQL account to back up.

Azure Account

You need an Azure account with Azure SQL and a storage account, of course. 🙂

You might want to have a separate backup storage account so that, if you cannot access your production account, you still have access to your backups. I personally download the bacpac backups from Azure to a local server.

Bacpac file export tool

In this example I will use the Red-Gate backup tool, because it allows you to easily make a copy of your database before the backup, which gives you transactionally safe backups.

But you can also use the DAC SQL Azure Import Export Service Client V 1.2. With that tool you need to make the database copy yourself, using for example the Cerebrata cmdlets.

Azure Storage account browser

You can use any tool you like. In this example I will use Azure Storage Explorer because it is free 🙂

Windows server

Ideally this would be a dedicated Windows 2008 server so that you can be sure it runs smoothly. You can also download the exported bacpac files to this server, just to be safe 🙂

Backup using Red-Gate command line backup tool

Here is a nice how-to video on setting up scripts with the Red-Gate tool. NOTE that in this video the script makes a backup into a different database; what we want to do is schedule bacpac file generation. If you want to test how bacpac file generation works without the command line, watch this video.

But back to business: your scheduled script should look something like this:

RedGate.SQLAzureBackupCommandLine.exe /AzureServer:[url_to_azure_server] /AzureDatabase:[databasename] /AzureUserName:[db_owner_username] /AzurePassword:[password] /CreateCopy /StorageAccount:[Azure_account_name] /AccessKey:[primary_or_secondary_azure_storage_key] /Container:[container_name_in_storage] /Filename:[filename_of_bacpac]

Notice that I did not use any real values in the above script. Just fill in the parameters between [ ] and schedule the script to run as often as you like. Note that you need to change the file name on every run, because overwriting an existing file with the same name does not work. I use date+time combinations.

So now we are ready for disasters 😉 Next I will explain how to perform a restore to a totally new Azure SQL server.

Restore using Windows Azure management portal

When you need to restore a database to a new server, you need to have access to the bacpac backup file in an Azure storage account.

  • Firstly, create a new Azure SQL server. A how-to video for that is here.
  • Secondly, and this is important, add the same database logins that the backed-up database had. Azure SQL user management is explained in detail here.
  • Thirdly, get the URL of the bacpac file using Azure Storage Explorer: open Azure Storage Explorer, log in to the storage account containing the bacpac file, select the bacpac file and press the View button. This is shown in the video in the next step.
  • Fourthly, you are ready to restore! It can be done as explained in this video.

If you have any comments or questions please don’t hesitate to ask.

Thanks for reading and happy backupping!

Why SLA does not make sense in the Cloud?

What do you really want to accomplish with a Service Level Agreement (SLA)? To punish, or to get the best support available as soon as possible? With traditional on-premise software, if there is a problem you are pretty much alone with it. The time and money between the binary hitting the fan and the fan being fixed come solely out of your pocket. In a traditional on-premise or dedicated-server environment an SLA makes sense: you need some leverage and some certainty that your software provider is at least mildly interested in fixing your problem.

When “Cloud Computing” hits speed bumps the whole Internet holds its breath, the latest example being the Amazon incident on April 21, 2011. If a SaaS service is not up and running, the SaaS firm is losing a lot of money, and fast. Most SaaS firms have a monthly recurring revenue model and customers can cancel their subscription with one month’s notice, which means customers can vote with their wallet. So you can be sure that a SaaS firm gives its fullest attention to getting its service up and running as soon as possible. If your provider has fully clouded its technology (so-called multitenancy), your application instance will be fixed as soon as the service is fixed. No one gets special treatment, good or bad, so there is no need to fear that your account is down while others are running.

Storm, calm or no cloud? You have no idea.

You should be able to see if your Cloud is in a storm or calm.

With a proper SLA the damages and indemnification are somehow tied to the amount of money you pay for the software, services included. With SaaS firms your monthly, and even yearly, fees are a fairly small amount of money, and so are the potential liquidated damages. This means that, for example, a 10% indemnification of the subscription value is merely a nominal sum: if a SaaS service costs you a $59 yearly subscription, one month of it is worth about $5, so 10% of that entitles you to roughly 50 cents of compensation for a month of downtime. And before you try to negotiate higher indemnification, talk to your own lawyer and ask whether you should sign a contract with a liquidated damages clause of over 100% of the contract value. Then imagine asking a SaaS firm for the same, and guess the response. My point is that there is no realistic way to get an SLA between you and a SaaS provider that carries real monetary indemnification. The real penalty for a SaaS provider comes in the form of lost income, increased churn, and negative publicity, and any serious SaaS provider will do everything to avoid those.

All is well in the cloud

Seek transparency instead of indemnification.

What I am trying to say is that instead of asking what SLA levels you receive and what kind of compensation is possible, ask what your SaaS provider is doing to minimize downtime. It does not make sense to try to get the SLA as tight as possible; it makes more sense to make sure the provider is “all in” with the cloud. If the service you are using truly has significant monetary value for your provider, the provider will make sure it runs as smoothly as humanly possible.

Make them prove that they know what they are doing, but not with an SLA. Ask about security, availability, recovery, and how you can monitor uptime. For example, Azure (Service Dashboard), AzureWatch and Sopima Oy provide RSS feeds to their customers to give transparency into their service levels.

Focus on finding signs of prevention, transparency and security, instead of indemnification in the SLA.

PS. If you want to know the 10 questions you should ask your SaaS vendor, and what the correct answers are, go here.

Clouding IKEA Style! Design the Price Tag First.

We developers, and the industry as a whole, should rethink how we start designing our cloud services. Customer-centric design is old news. SaaS entrepreneur Rainer Stropek, whom I met in Berlin, said wisely: “Do pricing like IKEA! First design the price tag.”

The decision to purchase your service should be almost subconscious for the customer.

While working at my startup Sopima I have learned that there are three reasons for this. I will discuss these findings at TechDays 2011, so please come and listen. You can also ask me questions via Twitter: tweet your questions and comments to me in advance at @anttimakkonen and I’ll try to answer you. But now back to business: read on for how to make the price tag your first priority.

Reason number 1

You will not win the battle for customers by being the nicest and prettiest service of its kind in the minds of Internet dwellers. Because of cloud services like Amazon (IaaS) and Windows Azure (PaaS), there is more competition coming your way, and at a speed you are not accustomed to. It is really easy to put up a service once you get past the initial learning curve of any cloud provider. Most of the problems you will face when building your service are not technical. Customers are not interested in what technical solutions you have mastered; they want a polished service experience that does the basic stuff extremely well.

Reason number 2

The decision to purchase your service should be almost subconscious for the customer. Customers that are lured to your service’s web page need to understand what the service is and what it costs in the short time before they leave the page. Otherwise you just paid for a lead that is never going to convert into a customer.

In software design this means that you need to design the billing of your product so that the customer understands what he or she is getting. For example, do not make them pay for upload/download bandwidth; charge instead for the total size of stored files per month. Keep rates simple, not complex. You need to do a little extra work to calculate what a customer normally needs; do not force customers to do that calculation. In addition, customers do not like surprises in their bills. Lots of SaaS entrepreneurs prefer to sell prepaid credits or similar to get commitment, and some money, fast. In my experience it is easier to sell “packages”, but don’t overdo the number of options. Calculating costs and prices takes us to the next reason.

Reason number 3

Computing resources and scalability are no longer major investment decisions. You just order the computing resources you need and you can easily cancel that order when you are done with them. Basically, you can run code as bad as your wallet can stand in the cloud, and your service will still scale. 🙂

In the age of cloud computing you need to direct your attention to what the cloud provider is charging you and minimize that cost. After you have made your service stateless, cost is the only limit on scaling your service up. You need to optimize and monitor your cost structure, and sometimes even make ‘strange’ design decisions if that helps pull your cloud costs down. This is even more important if your service’s revenue streams are dispersed and the profit margins are lean.

As a final thought I’d like to say

Consider billing an integral part of your service. Do not make the customer reconsider their purchase with every obscure bill you send. Especially if you are building a subscription service, do not implement your own billing! Trust the professionals and consider using one of the ready-made services like these:

PS. I found these tools that will help you estimate, monitor and minimize Azure costs.

I have my app in Azure. Now what?

So you have your application running on Azure. A list of next steps would be nice to have. Well, maybe, but first you need to figure out answers to a couple of important things:

  • Backup
  • Monitoring
  • Autoscaling

As of now Azure does not provide any of these off the shelf; you need to build them yourself. I will describe one possible setup that provides some functionality for the points listed above. Being the cloud man that I am, this setup also runs entirely in the cloud 🙂

Setup

Firstly, the setup. In order to get things up and running you will need the following tools and accounts:

  • Amazon EC2 account with Windows 2008 and Amazon monitoring service.
  • Windows 2008 server (with SQL Server 2008 R2 client utilities) for Azure backup.
  • SQL Server 2008 R2 for Azure SQL backup.
  • Azure Management Cmdlets tools for Azure Storage backup.
  • Red-Gate SQL Comparison Bundle version 9 or above for Azure SQL backup.
  • AzureWatch account for Autoscaling and monitoring Azure web and worker roles.
  • (Optionally) Dropbox with Packrat service so you will get unlimited undo history on your backups.

The overall setup will look something like the picture below. You could replace the Amazon EC2 Windows 2008 server + SQL Server 2008 R2 with an Azure VM role or your own hosted server, and Dropbox with an Azure storage account. By replacing Amazon with an Azure VM role you will save on data transfer fees, and the steps should be fairly similar. If you decide to do that, I would recommend keeping the backups under a different Azure subscription so that a single administrator cannot delete both your backups and your service!

Azure setup

High level picture of monitored Azure app with backup and autoscale.

In addition, make sure the Windows 2008 server has at least SQL Server 2008 R2 installed, especially bcp.exe version 10.50.1600.1. The bcp.exe utility is used to perform the database backups, and older versions had a nasty bug that prevented the backup from working.

If you have several services running on one Azure subscription it is useful to direct their logs into one shared storage account. This is because AzureWatch can monitor only one log account per subscription.

Installation instructions

I won’t go into all the details because the steps are quite obvious. If you get confused by my quick instructions, please send me an email and I will add more details to this blog post.

  1. Get an Amazon account and launch a Windows 2008 server with SQL Server 2008 R2, or create a VM role in Azure.
  2. Install SQL Server 2008 R2 client utilities
  3. Install Dropbox
  4. Install Azure Management Cmdlets
  5. Install Red-Gate SQL Comparison Bundle
  6. Get an AzureWatch account and install the control panel on the 2008 server.

Scripts

You should also automate running these PowerShell scripts with the Windows 2008 Server Task Scheduler:

  • Remember that you need to give the full path to the PowerShell script in the task arguments, like this:

-noninteractive -nologo -command "& 'c:\my backup scripts\backup.ps1'"

  • Modify the SQL backup script from Mike Mooney’s blog to suit your needs. Make sure that the zip files are stored in a folder that is synchronized with Dropbox.
  • Modify the backup sample scripts from the Azure Management Cmdlets to suit your needs. The samples can be found in the installation folder. You can direct the backup to an Azure storage account or to the Dropbox folder.

That’s it!

Happy Clouding!