Random Ramblings About Making Games and Stuff from Cloud

Posts tagged ‘Azure’

Things I learned building OData service to Azure and using WP7 as client

I wanted to build a Windows Phone 7 app called “Stuff I Like”. The app would have an OData service running in Azure and a client that would cache and sync against that service. In the previous post I wrote about the things I learned building the WP7 app. In this part I will reveal my findings on the service side of things. In the next post I will delve into the authentication side of the app.

If you are planning to add federated authentication using the Azure Access Control service, then you might want to start by building the authentication first. For me it was waaaay easier to add a plain ASP.NET web page, add authentication to that, and add the WCF Data Service after login worked. I would recommend downloading the Azure Training Kit: http://www.microsoft.com/en-us/download/details.aspx?id=8396 and completing this lab exercise: http://msdn.microsoft.com/en-us/identitytrainingcourse_acsandwindowsphone7.aspx

Now back to business.

I followed this “Data service in the cloud” walkthrough: http://msdn.microsoft.com/en-us/data/gg192994 and decided to design the data model using the Visual Studio designer. After playing around with the tool I managed to “draw” the data model, and after that it was just a matter of syncing it to the database.

This is how I drew the data model

After syncing the above data model into the database, setting up the service and running it, I quickly found that my WCF Data Services were not working at all. The message “The server encountered an error processing the request. See server logs for more details.” was shown to me quite frequently. Well, “the bug” was quite simple to fix, and these debugging instructions helped me a lot: http://www.bondigeek.com/blog/2010/12/11/debugging-wcf-data-services:

  1. A missing SetEntitySetAccessRule
  2. A missing pluralisation on the SetEntitySetAccessRule
  3. A missing SetServiceOperationAccessRule

After debugging the entity set access rules with a simple “*” AllRead rule, just to check that I had not made a typo, I quickly found out that I indeed had 😦 The typo was in an EdmRelationshipAttribute, and it was causing the exception. After I fixed that stupid mistake things started to look better.
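For reference, this is roughly the kind of InitializeService setup the fix boils down to. It is only a minimal sketch: StuffILikeEntities is a placeholder for whatever name your generated entity model has, and the wide-open “*” rule plus the verbose error settings are for debugging only, not for production.

using System.Data.Services;
using System.Data.Services.Common;
using System.ServiceModel;

// IncludeExceptionDetailInFaults surfaces the real exception instead of
// "The server encountered an error processing the request." Remove it for production.
[ServiceBehavior(IncludeExceptionDetailInFaults = true)]
public class WcfDataServiceStuffILike : DataService<StuffILikeEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Wide open for read while debugging; tighten per entity set afterwards.
        // Per-set rules use the entity *set* name, so watch the pluralisation,
        // e.g. config.SetEntitySetAccessRule("StuffSet", EntitySetRights.AllRead);
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);

        // Return full error details to the client instead of the generic message.
        config.UseVerboseErrors = true;

        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}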

If you need more instructions on how to turn on debugging messages, the UseVerboseErrors and IncludeExceptionDetailInFaults settings in the sketch above are the place to start.

This is how my service looked after I finally got it running

After I managed to get the service defined and running, I took a moment to make the OData feed a bit more readable, so that you can consume your OData feed using Internet Explorer and other browsers: http://msdn.microsoft.com/en-us/library/ee373839.aspx

This is how the OData feed looks before you tweak it.

For some reason I first managed to add “m:FC_TargetPath” and the related mapping attributes to the wrong XML element. They belong on the entity properties in the conceptual model section of the .edmx file, not in the storage model, so make sure you scroll down the file and add them in the correct place 🙂

This is how the OData feed will look after you tweak it.
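If your model were defined in code (the reflection provider) instead of the designer, the same feed customization would be expressed with attributes on the entity class rather than with m:FC_* attributes in the .edmx. A rough sketch of the idea; the Stuff class and its Name and Description properties are made-up placeholders:

using System;
using System.Data.Services.Common;

// Map Name into the Atom <title> and Description into <summary>, so the feed
// is readable in a browser. The final false drops them from <content>.
[EntityPropertyMapping("Name", SyndicationItemProperty.Title,
    SyndicationTextContentKind.Plaintext, false)]
[EntityPropertyMapping("Description", SyndicationItemProperty.Summary,
    SyndicationTextContentKind.Plaintext, false)]
public class Stuff
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
}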

Another thing that took me a couple of hours to figure out was that Internet Explorer does not show all OData results in a consistent way. So before you start heavy debugging, check the returned page’s source code and you should see the expected result in XML format. Or you could use another browser. For example, this call did not seem to return anything until I checked the page source: http://localhost:57510/WcfDataServiceStuffILike.svc/StuffSet(guid'cf1bfd2f-99f3-4047-99f8-22bc1aad1b99')/GategorySet
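A quick way to rule out the browser entirely is to fetch the feed yourself and look at the raw Atom XML. A minimal sketch; the URL is just the example address from above:

using System;
using System.IO;
using System.Net;

class FeedCheck
{
    static void Main()
    {
        // Ask explicitly for Atom so we see the same payload the client library gets.
        var request = (HttpWebRequest)WebRequest.Create(
            "http://localhost:57510/WcfDataServiceStuffILike.svc/StuffSet");
        request.Accept = "application/atom+xml";

        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}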

So that’s it. Using Visual Studio this was quite easy, and I actually spent most of my time figuring out why some configuration did not work rather than writing code. This might be because I have an “unclean” dev environment, or because I made lots of changes to the demos above while I followed them. That was mainly because I wanted to build my own app and not simply type in the demos and labs.

I bet that if you build your dev environment correctly and follow the labs and demos to the letter, you won’t see as many problems as I did. But where is the fun in that? 😉


Azure SQL backup using Red-Gate Cloud Services

I stumbled upon Red Gate Azure Backup service: https://cloudservices.red-gate.com/

After using the beta for a couple of weeks I am really impressed. Everything is easy, the UI is intuitive and you don’t need lots of configuration to make this work. You can set up a basic daily backup in a matter of minutes, if you have an Azure Storage or Amazon S3 account ready. If not, it still does not take long. You spend most of the time digging up usernames and passwords, not figuring out how things work! Splendid.

So let’s go through how to do a daily Azure SQL backup into Amazon S3 storage.

Red-Gate Cloud Service Dashboard is really clean and simple. Just select the action you want to perform.

First, after registering for Red-Gate Cloud Services and logging in, select “Backup SQL Azure to Amazon S3” from the Dashboard.

Just fill in the Azure SQL credentials and Amazon S3 keys.

Secondly, fill in the Azure SQL server name (xxxxxxx.database.windows.net), the login credentials for that DB, and press the refresh button next to the Database drop-down. Select the database that you want to back up. Next, click on the “AWS Security Credentials” link and log in to your Amazon AWS account. You will be taken directly to the place where you can find the keys.

"Access key id" goes to "AWS Access Key" and "Secret Access Key" is shown after pressing "show" link in Amazon AWS.

Note that you need an S3 bucket. If you don’t have one you can create it here: https://console.aws.amazon.com/s3/home. I will not explain how that is done, but it is really easy. After you have entered the Amazon keys you can press the refresh button next to the Bucket drop-down in the Red-Gate UI. Then fill in the name that you want your backup file to have. Don’t worry about a date stamp, because the tool adds the date to the end of the filename. Press the “Continue” button to select scheduling options or to back up right away.

Just select when you want your daily backup to run, or press the "Backup now" button.

Next, fill in the exact time you want your backup to run (in 24-hour format), select the time zone, select the weekdays on which to run it, and press the Schedule button. You can also just press the "Backup now" button for a one-time backup. Note that there is also a monthly backup option.

You should see your scheduled backup on the next screen. When your first backup is done it appears here under the History header.

Next you are taken to the schedules and history view. Here you can view details of the upcoming backup operation, cancel it, or invoke a backup right away. Just move your mouse over the upcoming event.

Move the mouse over a scheduled backup job and you can cancel it or invoke it right away.

When a backup job has completed successfully, you will receive an email and a “log” line will appear under “schedules and backup history”.

Whether the job succeeds or not, it will appear here under the History title.

You can view details of executed jobs by moving the mouse over a history line. If your backup job has run into errors, you will see a warning triangle and receive the error message via email. The email also contains a direct link to that specific history log line. Neat!

If your backup job has exceptions, it has a warning icon in the history. The Details link will show an error message describing what went wrong.

Clicking the link in the email takes you to the same view as clicking “Details” next to a completed job.

Here you can read through what happened while the backup was executing. If you ran into errors they are shown as well. Just scroll down from the bar.

If your backup run completed cleanly, you should see the created bacpac file in your Amazon S3 bucket.

Backup file is safe and sound in Amazon S3

Note that you can just as easily use Azure Storage, or use FTP to upload the backup into your own backup infrastructure.

There are a couple of missing features that I would like to see: for example, an option like the one in the Red-Gate backup tool to first create a copy of the database (for transactional safety), and the ability to encrypt the backup file before it is uploaded to Amazon or Azure storage. But even without these small features this is still awesome!

Carefree clouding!

Azure SQL Backup and restore scenarios using bacpac export/import

What Azure SQL was missing was a properly supported way to make backups for disaster scenarios, such as losing control of your Azure SQL server, or a human error where a SQL admin deletes the whole Azure SQL server. Great news, everyone: Azure SQL now has tools to mitigate the impact of these scenarios. The SQL Azure Import/Export Service CTP is now available. More details on how it works can be found here.

What do you need?

You need a means of scheduling the backup, an Azure Storage account in which to put the bacpac files, a means of getting the exact URL of a bacpac file, and an Azure SQL account to back up.

Azure Account

You need an Azure account with Azure SQL and Azure Storage, of course. 🙂

You might want to have a separate storage account for backups so that, if you cannot access your production account, you still have access to your backups. I personally also download the bacpac backups from Azure to a local server.
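Downloading a bacpac to a local server, and getting its exact URL for the restore step, is easy to script too. Here is a minimal sketch using the Windows Azure StorageClient library; the account, container and file names are placeholders, not real values:

using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class DownloadBacpac
{
    static void Main()
    {
        // Placeholder connection string for the backup storage account.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=[backup_account];AccountKey=[key]");
        CloudBlobClient client = account.CreateCloudBlobClient();

        CloudBlob blob = client.GetContainerReference("backups")
                               .GetBlobReference("stuffilike-20111024.bacpac");

        // This is also an easy way to get the exact URL that the restore step needs.
        Console.WriteLine(blob.Uri);

        // Keep a local copy, just to be safe.
        blob.DownloadToFile(@"D:\backups\stuffilike-20111024.bacpac");
    }
}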

Bacpac file export tool

In this example I will use the Red-Gate backup tool, because it allows you to easily make a copy of your database before the backup. This lets you make transactionally safe backups.

But you can also use the DAC SQL Azure Import Export Service Client V 1.2. With that tool you need to make the database copy yourself, for example using the Cerebrata cmdlets.

Azure Storage account browser

You can use any tool you like. In this example I will use Azure Storage Explorer because it is free 🙂

Windows server

Ideally this would be a dedicated Windows 2008 server so that you can be sure it runs smoothly. You can also download the exported bacpac files to this server, just to be safe 🙂

Backup using Red-Gate command line backup tool

Here is a nice how-to video on setting up scripts with the Red-Gate tool. NOTE that in this video the script makes a backup into a different database; what we want to do is schedule bacpac file generation. If you want to test how bacpac file generation works without the command line, watch this video.

But back to business: your scheduled script should look something like this:

RedGate.SQLAzureBackupCommandLine.exe /AzureServer:[url_to_azure_server] /AzureDatabase:[databasename] /AzureUserName:[db_owner_username] /AzurePassword:[password] /CreateCopy /StorageAccount:[Azure_account_name] /AccessKey:[primary_or_secondary_azure_storage_key] /Container:[container_name_in_storage] /Filename:[filename_of_bacpac]

Notice that I did not use any real values in the above script. Just fill in the parameters between [ ] and schedule the script to run as often as you like. Note that you need to change the file name on every run, because overwriting with the same file name does not work. I use date+time combinations, as in the sketch below.
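If you do not want to edit the scheduled script by hand before every run, one option is a tiny wrapper that stamps the file name with the date and time and then launches the command line tool. This is only a sketch: the install path is a guess, and the bracketed parameters are the same placeholders as above.

using System;
using System.Diagnostics;

class ScheduledBacpacExport
{
    static void Main()
    {
        // Build a unique file name per run, e.g. stuffilike-20111024-0300
        string fileName = "stuffilike-" + DateTime.UtcNow.ToString("yyyyMMdd-HHmm");

        string args =
            "/AzureServer:[url_to_azure_server] /AzureDatabase:[databasename] " +
            "/AzureUserName:[db_owner_username] /AzurePassword:[password] /CreateCopy " +
            "/StorageAccount:[Azure_account_name] /AccessKey:[primary_or_secondary_azure_storage_key] " +
            "/Container:[container_name_in_storage] /Filename:" + fileName;

        // Placeholder path; point this at wherever the Red-Gate command line tool is installed.
        string exe = @"C:\Program Files\Red Gate\RedGate.SQLAzureBackupCommandLine.exe";

        using (Process process = Process.Start(exe, args))
        {
            process.WaitForExit();
            // Pass the exit code back so the scheduler can flag failed runs.
            Environment.Exit(process.ExitCode);
        }
    }
}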

So now we are ready for disasters 😉 Next I will explain how to perform a restore to a totally new Azure SQL server.

Restore using Windows Azure management portal

When you need to restore a database to a new server, you need access to the bacpac backup file in an Azure storage account.

  • Firstly, create a new Azure SQL server. A how-to video for that is here.
  • Secondly, and this is important, add the same database logins that the backed-up database had. Azure SQL user management is explained in detail here.
  • Thirdly, get the URL of the bacpac file using Azure Storage Explorer. Open Azure Storage Explorer, log in to the storage account containing the bacpac file, select the bacpac file, and press the View button. This is also shown in the video in the next step.
  • Fourthly, you are ready to restore! It can be done as explained in this video.

If you have any comments or questions please don’t hesitate to ask.

Thanks for reading and happy backupping!

Clouding IKEA Style! Design the Price Tag First.

We developers, and the industry as a whole, should rethink how we design our cloud services. Customer-centric design is old news. SaaS entrepreneur Rainer Stropek, whom I met in Berlin, said it wisely: “Do pricing like IKEA! First design the price tag.”

The decision to purchase your service should be almost subconscious for the customer.

While working at my startup Sopima I have learned that there are three reasons for this. I will discuss these findings at TechDays 2011, so please come and listen – you can also ask me questions, for example via Twitter: tweet your questions and comments to me in advance (@anttimakkonen) and I’ll try to answer you. But now let’s get back to business. Read on to learn how to make the price tag your first priority.

Reason number 1

You will not win the battle for customers by being the nicest and prettiest service of its kind in the minds of Internet dwellers. Because of cloud services like Amazon (IaaS) and Windows Azure (PaaS), there is more competition coming your way, and at a speed you are not accustomed to. It is really easy to put up a service once you get past the initial learning curve of any cloud provider. Most of the problems you will face when building your service are not technical. Customers are not interested in which technical solutions you have mastered; they want a polished service experience that does the basic stuff extremely well.

Reason number 2

The decision to purchase your service should be almost subconscious for the customer. Customers lured to your service’s web page need to understand what the service is and what it costs in the short time before they leave the page. Otherwise you just paid for a lead that is not going to convert into a customer.

In software design this means that you need to design the billing of your product so that the customer understands what he or she is getting. For example, do not make them pay for upload/download bandwidth, but rather for the total size of stored files per month. Keep rates simple, not complex. You need to do a little extra work to calculate what a customer normally needs; do not force customers to do that calculation. In addition, customers do not like surprises in their bills. Lots of SaaS entrepreneurs prefer to sell prepaid credits or similar to get commitment, and some money, fast. In my experience it is easier to purchase “packages”, but don’t overdo the number of options. Calculating costs and prices takes us to the next reason.

Reason number 3

Computing resources and scalability are no longer major investment decisions. You just order the computing resources you need, and you can easily cancel that order when you are done with them. Basically, you can run code as bad as your wallet can stand in the cloud and your service will still scale. 🙂

In the age of cloud computing you need to pay attention to what the cloud service provider is charging you, and minimize that cost. After you have made your service stateless, cost is the only limit on scaling your service up. You need to optimize and monitor your cost structure – and sometimes even make ‘strange’ design decisions if that helps you pull your cloud costs down. This is even more important if your service revenue streams are dispersed and the profit margins are lean.

As a final thought I’d like to say:

Consider billing an integral part of your service. Do not make the customer reconsider the purchase with every obscure bill you send. Especially if you are building a subscription service, do not implement your own billing! Trust the professionals and consider using some of the ready-made services like these:

PS. I also found these tools that will help you estimate, monitor and minimize Azure costs.

I have my app in the Azure. Now what?

So you have your application running on Azure. Is that it? Well, not quite. You still need to figure out answers to a few important things:

  • Backup
  • Monitoring
  • Autoscaling

As of now Azure does not provide any of these off the shelf; you need to build them yourself. I will describe one possible setup that covers the points listed above. Being the Cloud Man that I am, this setup also runs entirely in the cloud 🙂

Setup

Firstly, the setup. In order to get things up and running you will need the following tools and accounts:

  • Amazon EC2 account with Windows 2008 and Amazon monitoring service.
  • Windows 2008 server (with SQL Server 2008 R2 client utilities) for Azure backup.
  • SQL Server 2008 R2 for Azure SQL backup.
  • Azure Management Cmdlets tools for Azure Storage backup.
  • Red-Gate SQL Comparison Bundle version 9 or above for Azure SQL backup.
  • AzureWatch account for Autoscaling and monitoring Azure web and worker roles.
  • (Optionally) Dropbox with Packrat service so you will get unlimited undo history on your backups.

The overall setup will look something like the picture below. You could replace the Amazon EC2 Windows 2008 server + SQL Server 2008 R2 with an Azure VM role or your own hosted server, and Dropbox with an Azure storage account. By replacing Amazon with an Azure VM role you save on data transfer fees, and the steps should be fairly similar. If you decide to do that, I would recommend keeping the backups under a different Azure subscription, so that one administrator cannot delete both your backups and your service!

Azure setup

High level picture of monitored Azure app with backup and autoscale.

In addition, make sure that the Windows 2008 server has at least SQL Server 2008 R2 installed, and especially bcp.exe version 10.50.1600.1 or later. That is because the bcp.exe utility is used to perform the database backups, and older versions had a nasty bug that prevented the backup from working.

If you have several services running on one Azure subscription, it is useful to direct their logs into one shared storage account, because AzureWatch can monitor only one log storage account per subscription.

Installation instructions

I won’t go into all the details because the steps are quite obvious. If you get confused by my quick instructions, please send me an email and I will add more details to this blog post.

  1. Get an Amazon account and launch a Windows 2008 server with SQL Server 2008 R2, or make a VM role on Azure.
  2. Install SQL Server 2008 R2 client utilities
  3. Install Dropbox
  4. Install Azure Management Cmdlets
  5. Install Red-Gate SQL Comparison Bundle
  6. Get an AzureWatch account and install the control panel on the 2008 server.

Scripts

You should also automate the running of these PowerShell scripts with the Windows 2008 Server Task Scheduler:

  • Remember that you need to give the full path to the PowerShell script in the task arguments, like this:

-noninteractive -nologo -command "& 'c:\my backup scripts\backup.ps1'"

  • Modify the SQL backup script from Mike Mooney’s blog to suit your needs. Make sure that the zip files are stored in a folder that is synchronized with Dropbox.
  • Modify the backup sample scripts that ship with the Azure Management Cmdlets to suit your needs. The samples can be found in the installation folder. You can direct the backup to an Azure storage account or to a Dropbox folder.

That’s it!

Happy Clouding!

Can You Trust Your Own Servers?

Yes, it's ugly, but it's my own!

I originally wrote this post for my company blog.

Can I trust cloud services? A very common question nowadays. In my opinion the question in the headline is just as relevant. Very often cloud services are seen only as a potential risk, and the benefits are forgotten. The company data is kept tightly on on-premise servers, under perfect control. A common thought goes like this: the cloud is a dangerous place and my own servers are safe, of course. Wrong. Your own server is your own server. The cloud is the cloud. Let me explain.

Let me compare the two approaches in the contract management context: on-premise servers and practices versus cloud offerings. Is your organization sending contract drafts and contracts to your business partners via unencrypted e-mail? How are your contracts protected, both physically and technically? Who can see the contents of your agreements? I dare to claim that current cloud services solve most of these problems.

TOP3 Cloud Service Myths

Myth 1. On the Internet there is always someone attacking the Cloud. Therefore, the cloud is a threat.

Maybe. But I’d like to ask: are your servers connected to the Internet? If so, welcome to the club; I hope you have done something about it. I would also point out that a firewall alone is not a sufficient answer. And if the server is not online, what the heck is it worth sitting ‘in the closet’? The organization must be able to use the stored information; the gigabytes themselves will not bring you any benefit, only using them will. One more thing: with cloud services, the administrators are monitoring the traffic and continuously checking the logs for alarming signs. Who is monitoring your logs?

Myth 2. When using Cloud Services, someone else may have access to my data. Therefore, the Cloud is dangerous.

As if hiding everything in the corner of your own server room were safe. Wrong. The fact is that hiding your wallet in your backyard is not as safe as putting it in a bank vault. Expired user access combined with shared user IDs increases the number of people who can see your information if they wish to. Can you be sure that user rights and access are accurate in your organization? With cloud services there is an automatic checkpoint for this every month, when the invoice comes.

Proper user rights management together with cloud services also brings better physical security: who is responsible for the costs if someone steals your servers? Naturally it is a good idea to check whether the servers of your cloud host are really safe. To find answers to that question, check out Pasi Mäkinen’s article in Tietoviikko (in Finnish).

Myth 3. When the Internet is down, the Cloud Services may be down too, for hours. Therefore, the Cloud is not reliable.

If the Internet is down, are your own services still available to your customers, and to you? When using cloud services it is not likely that you get any compensation for the lost time, but do you get any if you have everything on your own servers? Most probably not, and on top of that the real trade-off is that someone (or several people) in your organization has to stop productive work and start solving the IT problems and correcting the situation. That costs a lot.

Don't hide your head in the sand. Look up to the clouds.

My point: you just cannot say that your own server is more secure. That is way too simple.

If I am proven wrong, you can truly be proud of having such a well-managed environment at a reasonable cost. On the other hand, I’d like to ask whether this is really the key task that adds value to your business. What if you used part of the time and energy you spend on internal data security efforts to develop a new business idea?

Finally, a word of warning. The Cloud is not the safest environment in the world, but I would argue that it is much safer than most of the internally tuned Extranets are.

I’d like to challenge you to investigate the Online Security promise of Microsoft Azure and compare it to your data security practices. You might be surprised. And I cannot promise that it will be a positive surprise.
