Random Ramblings About Making Games and Stuff from Cloud

We producers tend to trust processes and forget that trust, communication, skill and determination are a huge thing. We tend to rely on Scrum, Kanban or similar over a person's skill and expertise. I know I did. I truly believe that we try to learn from past projects and even from postmortems written by kind folk on Gamasutra, but we usually miss the true reasons why we fail in game dev projects.

Making software is hard. Making games is damn near impossible. We keep making mistakes, and we try to find the reasons, which is what we are supposed to do. But the places where we usually make mistakes are not the ones that following a process will solve. The problem might be that we really don’t know how to make a fun game. We know how to make games, and we want to believe that we know what a fun game is, but ultimately it is the players who will tell us how we did. When we fail we tend to take a good look at what went wrong and make adjustments to the processes. While reflecting on the past is a good idea, we tend to dismiss the people behind the process. Usually the mistake was not made because we lacked full control over what happened. More often the reason is that the direction, goals or communication were not clear. In some cases we simply did not have the skills or knowledge. I don’t think these situations can be solved by adding more and more control through process fixes.

In my opinion processes should not be treated like silver bullets that work all the time across different companies, projects and team members. No process is a substitute for skill, motivation, communication and ownership. If the team and its members don’t want to, don’t know how to, or are not allowed to make a fun game, I believe it can’t be done with any process. Using strict processes might tell you that the project is not going according to plan, but it will not launch a game. I challenge you to find what works best with your team and with each member of the team, and to heavily modify the ways you currently work if needed. It will not be easy, but as I found, it will pay off in the end. The process should give everyone in the team clear goals, improve communication and build trust. In most of the companies I used to work for, we used processes to get control over the team and the game's design direction, and to get visibility into how well the team did their work — and usually the people actually working in the team did not find them that useful.

I was recently fortunate enough to work with a dream team, and it opened my eyes. We managed to launch a top-quality game in nine months with next to no pre-production. During that time I had one of the easiest projects of my producer career. It was odd, since we had a hard launch with a fixed deadline and a brand that demanded excellence in quality and fun factor. The reason it was easy was that I was able to truly trust that the team was doing their best. Even when the game’s design direction was fluctuating quite a bit, I managed to remain calm because I knew the team believed it was the best thing to do. I could fully focus on producing: foreseeing possible problems, getting help to the team when needed, reminding the team about goals and schedule, communicating what we were doing and why to the rest of the company, and keeping the troops fed, happy and with their eyes on the big picture.

After our nine-month game project I believe I have a hunch about how game projects should be directed and how a highly productive team should be built. It is essential to trust the team, and that the team is trusted by the company. We should let the professionals do their work and make the decisions, even if you don’t fully agree with everything. That is why you (or someone else) hired them in the first place. When the team works well together and has the skills, motivation and a clear direction on where to take the game, you don’t need heavy processes — just communication and shared goals. Agree on the goals with the team, decide who does what, and go for it. Follow up on the goals at a high level by looking at the daily build. Changing the goals should be as easy as identifying that the current ones are no longer relevant and agreeing on new ones.

How we worked as a team

A word of warning before you read on. Our team had been working together for a while and we knew how we worked together. The tools we used and the game genre were familiar to us, and the team members were highly motivated and skillful professionals. So we knew what we were doing. Don’t apply the following ways of working without considering whether they can work for your team.

An extra warning for the not-so-indies out there. We really had creative freedom, peace to operate, and we were allowed to make decisions inside the team. The following things will not work if you don’t have the power and freedom to make the decisions your team believes are best for the game.

What worked in our team was short one-week sprints with goals like “character abilities for playable demo” and “players are able to purchase gold from the shop”. At the start of the sprint we had a meeting with the leads where we agreed on the goals. Next, the developers had their own meeting where they agreed on who does what and how to get there. One developer was reserved for the designer to code gameplay features as soon as possible without heavy game design documentation. We had two daily meetings where we showed the team what we had done and what we were planning to do next. The only record of things to do was a whiteboard with a huge calendar to remind us how much time we had left. The whiteboard was redesigned a couple of times during the project to better match the current state of the project. The only thing that remained stable was a pair of printed A4 papers: one labeled Important at the top of the whiteboard and one labeled Not Important at the bottom. Important targets, goals and tasks were put at the top, and the not-so-important ones went to the lower part of the whiteboard. Usually the not-important ones were removed from the wall without ever being implemented at all. And that was the whole process. It looks and smells a lot like Scrum, but with two main differences. First, the process was designed to give as much information as possible to the team while keeping the focus on end results, not tasks. Second, we did not measure or monitor how long tasks took to complete. We only looked at what was in the game build each day — at times per dev build, every hour.

When it comes to working with the people in the team, it varied. One developer preferred to work with a prioritized list, so we made one for his tasks. One developer preferred to take one huge goal like “payment system” or “event system” and work on that until it was completed. One developer liked to prototype, so we paired him with the designer and they talked together about what to try out next. One artist liked to work on the scenes, another wanted a list of game assets to work through, and so on. What I am saying is that the only uniform thing in how we tracked progress was that twice a day we showed what we had done; beyond that, we talked a lot to each other and trusted each other’s opinions and decisions.

Final words

So anyway, this is what I learned in the last nine months about what is important in game development and how a highly productive team can be built. Hopefully you find something useful in my ramblings. Start trusting people and see where that takes you.

Thanks for reading and hopefully you got something out of this post.

Also published on Gamasutra as a featured blog post.


Tools for decision making

One big problem in game development projects is that we are in constant doubt about the game we are working on. Do we need another game mode? Should we change the graphics because they seem too dark? Is this fun? Should we add this new feature? Should we even be making this game in the first place? Is our game good enough, given that our competitors have much better graphics? We are constantly making decisions based on hunch, fear and panic. To make things worse, we constantly end up revisiting the same problems and decisions over and over again.

I personally believe in the mantra “if your only tool is a hammer, then every problem looks like a nail”. When you want to stick two pieces of lumber together there are nails, screws and glue, just to mention a few options that pop into my mind. The same applies when you are making a decision: there are tools that will help you make decisions and have the confidence to stand behind them. I don’t have magical knowledge of when to apply a certain tool to a certain problem so that it always results in the best decision available. The reality is that every game — and the people making the decisions at its various development stages — is different.

So what does this blog post mean for you and me? Hopefully you will find some new tools, or at least new perspectives on how decision-making can be less paralyzing and how to make sanity checks on your decisions. For me, the best thing that might come from this blog post is your comments. I want to hear what tools you have in your arsenal and what new perspectives I could learn from you.

The first tool that I use in these situations is to ask someone who is not working on the game. My animator friend described one of the problems of animating a movie as follows: “Animating a movie takes so much time that you must have a really good reason to make it an animation in the first place. It would be so much faster to just shoot it on video using live actors, or draw a comic book. There will be a time when you start to hate your story and want to make changes. This is just because you have worked on the project for so long. When you find yourself wanting to change things mid-project, the first thing you should do is ask someone else’s opinion.” In a nutshell this means that you will be fed up with your game project before you have finished it. Sometimes changes can enhance the game, but sometimes you will want to change things just to amuse yourself. So before you start making changes, ask for an external opinion. Help can be found from fans, a colleague or a family member, just to name a few. As long as those people have not been involved with your project as heavily as you, they might have a better understanding of what is new and fun in your game. Of course, you should not take their word as absolute truth, because in game development there is a huge amount of uncertainty until the game is polished and in the hands of the players.

The second tool is to always remember why you are making the game and to steer the project towards that goal. One of the most useful pieces of knowledge was passed to me just a few weeks ago: “You should always be aware of why you are doing something. After that, have an idea of how to measure your success when your game is live (or in alpha testing). If everybody in the team understands the reasons why the game is being made, they will be more likely to make the right decisions.” So you should always be aware of why you are making a game or an update, and everybody in the project team should know those reasons. Reasons may vary from personal ambitions (“I want to make this kind of a game”) to more business-oriented ones (“we need to increase our daily active user count”). Regardless of the origin or the reason itself, it needs to be stated aloud and measured accordingly. If you do that, it becomes easier to make decisions. It is always possible to revisit the reasons why you are making something — and when you do change the reason, don’t be afraid to redesign the whole game or cancel the project.

The third tool in my arsenal is to measure and analyze your success and learn from your mistakes. The following piece of helpful encouragement was handed to me by my boss once: “You are going to make mistakes. A lot of them. And if you don’t follow the impact of your desired targets, there is a high chance that you will never learn from your decisions (and how to make them).” Be prepared to measure your success in some measurable way and learn from your mistakes. If increasing daily active users is your aim, measure the effect post launch, analyze your results and learn from them. If you were making a totally new kind of game, check reviews and comments to see whether the press and gamers really noticed that this is something totally new. You should also try to learn how you and your team make decisions. Keep a decision log where you list decisions, dates, people involved and the reasons why each decision was made. Use that list when you face a situation where you might want to change direction or revert a decision you already made. Also make a decision log review part of your postmortem process and try to learn from it. If you are lucky, you might even learn from the mistakes of others. Read lots of postmortems and ask how other people make decisions in your workplace or startup sparring circle.
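To make the decision log idea concrete, here is a minimal sketch of what one entry could look like in C#. This is just my own illustration — the class and field names are invented, not from any particular tool:

```csharp
using System;
using System.Collections.Generic;

// Sketch of a decision log entry; adapt the fields to what your team cares about.
public class DecisionLogEntry
{
    public DateTime Date { get; set; }
    public string Decision { get; set; }              // what was decided
    public string Reason { get; set; }                // why it was decided
    public List<string> PeopleInvolved { get; set; }  // who was in the room
    public string MeasuredOutcome { get; set; }       // filled in later, post launch
}
```

Even a shared spreadsheet with these same columns works; the point is that the log exists and is reviewed in the postmortem.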

The fourth tool would be to fail fast and not be afraid of making mistakes. My boss and numerous excellent blog posts on Gamasutra (1, 2) already recognize this. You are going to make mistakes, and a lot of them. Don’t be afraid to try something out. You should also be equally willing to admit to yourself when the thing you tried does not work. Don’t force something into a game just because you made a decision about it. Do, review, iterate and abandon ideas in fast cycles.

Well, that was my two cents on how decision-making could be easier and how your team’s decision-making could be more efficient. I would love to hear your experiences, comments and ideas. I am always looking for opportunities to learn from the best.

Also posted on Gamasutra as a featured blog post.

I wanted to build a Windows Phone 7 app called “Stuff I Like”. The app would have an OData service running in Azure and a client that would cache and sync to that service. In the previous post I wrote about the things I learned building the WP7 app. In this part I will reveal my findings on the service side of things. In the next post I will delve into the authentication side of the app.

If you are planning to add federated authentication using the Azure Access Control service, you might want to start by building authentication first. For me it was waaaay easier to build a normal ASP.NET web page, add authentication to that, and then add the WCF Data Service after login worked. I would recommend downloading the Azure Training Kit: http://www.microsoft.com/en-us/download/details.aspx?id=8396 and completing this lab exercise: http://msdn.microsoft.com/en-us/identitytrainingcourse_acsandwindowsphone7.aspx

Now back to business.

I followed this Data Service in the Cloud walkthrough: http://msdn.microsoft.com/en-us/data/gg192994. I decided to design the data model using the Visual Studio design tool. After playing around with the tool I managed to “draw” the data model, and after that it was just a matter of syncing it to the database.

This is how I drew the data model

After syncing the above data model into the database, setting up the service and running it, I quickly found that my WCF Data Service was not working at all. The message “The server encountered an error processing the request. See server logs for more details.” was shown to me quite frequently. Well, “the bug” was quite simple to fix, and these debugging instructions helped me a lot: http://www.bondigeek.com/blog/2010/12/11/debugging-wcf-data-services. The usual suspects are:

  1. A missing SetEntitySetAccessRule
  2. A missing pluralisation on the SetEntitySetAccessRule
  3. A missing SetServiceOperationAccessRule

After debugging the entity access rules with a simple “*” AllRead rule just to check that I had not made a typo, I quickly found out that I indeed had 😦 The typo was in an EdmRelationshipAttribute, and it was causing the exception. After I fixed that stupid mistake, things started to look better.

If you need more instructions on how to turn on debugging messages then just follow these instructions:
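As a sketch of what those debugging switches look like in code — assuming a standard WCF Data Services project, and note that the context class name StuffILikeEntities is my own guess, not from the original project:

```csharp
using System.Data.Services;
using System.Data.Services.Common;
using System.ServiceModel;

// Sketch: expose entity sets and surface real error messages while debugging.
[ServiceBehavior(IncludeExceptionDetailInFaults = true)]
public class WcfDataServiceStuffILike : DataService<StuffILikeEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // "*" grants read access to all entity sets; tighten this for production.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);

        // Return exception details instead of the generic
        // "The server encountered an error processing the request."
        config.UseVerboseErrors = true;

        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}
```

Remember to turn UseVerboseErrors and IncludeExceptionDetailInFaults off again before shipping, since they leak internals to callers.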

This is how my service looked after I finally got it running

After I managed to get the service defined and running, I took a second to make the OData feed a bit more readable, so that you can consume it using Internet Explorer and other browsers: http://msdn.microsoft.com/en-us/library/ee373839.aspx

This is how OData feed looks before you tweak it a bit.

For some reason I first managed to add “m:FC_TargetPath” and similar properties to the wrong XML element. So make sure you scroll down the file and add them to the correct place 🙂

This is how OData feed will look when you tweak it a bit
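For reference, the readability tweak boils down to adding friendly-feed mapping attributes to a property in the conceptual model (CSDL) section of the .edmx file — as I recall, the trap is adding them to the storage model section higher up in the file instead. A sketch, with a property name of my own choosing:

```xml
<!-- In the CSDL section of the .edmx: map the Name property to the feed title
     so browsers render something readable instead of an empty entry. -->
<Property Name="Name" Type="String" Nullable="false"
          m:FC_TargetPath="SyndicationTitle"
          m:FC_ContentKind="text"
          m:FC_KeepInContent="true" />
```

The m: prefix here is the usual OData metadata namespace that the walkthrough linked above sets up.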

Another thing that took me a couple of hours to figure out was that Internet Explorer does not show all OData results in a consistent way. So before you start heavy debugging, check the returned HTML page's source code — you should see the expected result in XML format. Or you could use another browser. For example, this call did not seem to return anything until I checked the source code of the page: http://localhost:57510/WcfDataServiceStuffILike.svc/StuffSet(guid’cf1bfd2f-99f3-4047-99f8-22bc1aad1b99′)/GategorySet

So that’s it. Using Visual Studio this was quite easy, and I actually spent more time figuring out why some configuration did not work than writing code. This might be due to the fact that I have an “unclean” dev environment, or that I made lots of changes to the above demos while I followed them. That was mainly because I wanted to build my own app and not simply type in demos and labs.

I bet that if you build your dev environment correctly and follow the labs and demos to the letter, you won’t see as many problems as I did. But where is the fun in that 😉

I recently started playing around with the Windows Phone 7 devkit and found it to be a surprisingly productive tool set. I managed to develop a simple app within a couple of weekends with almost zero previous knowledge of WP7 programming. I learned a couple of things on the way and wanted to share them with you.

What I did was start by installing Expression Blend, Visual Studio 2010 and the WP7 SDK. I made the UI using Blend and did the coding in Visual Studio. The benefits of this were a near-WYSIWYG experience with the WP7 UI and no broken XAML during development.

And now for the tips I learned:

  • Start with the data. Design the data model first, because it will help you work with Expression Blend. Make design-time data for your Blend project and learn how to troubleshoot it.
  • Trust Blend. Use it and learn it. I admit that it took a good amount of time to find out how to do everything. Some things I found hard to locate:
    • Data binding to UI items.
    • For tables you need to add row and column definitions (found under Layout).
    • To add more complex list box items than plain text, you need to edit the list box item template. Found by right-clicking the list box: Edit Generated Items, Edit Current.
    • This is how to create context menu items.
    • This is how to create an application bar.
    • Use system styles (PhoneForeground etc.) in text boxes, so that if the user changes the phone theme your app follows along.
    • DO NOT apply system styles to list boxes! It will break the selected-item styles. Adjust the font size and font, or make your own styles that only change font and font size. Otherwise you will spend time re-implementing the list box's selected-item visualization.
  • BIND the UI to data. Seriously, this will help a lot. You should not assign data to UI components directly. All my weird UI bugs where data was not updating were the result of not doing the binding correctly.
  • Test periodically that binding works with your data. You really want binding to work.
  • Make your data classes implement INotifyPropertyChanged, so that when the data changes, the change automatically shows up in the UI.
  • Define your data as serializable.
  • Icons: the default icons can be found in “C:\Program Files (x86)\Microsoft SDKs\Windows Phone\v7.1\Icons”.
  • To make clear enough icons you can also use Inkscape with a 1028×1028 canvas and convert the image to PNG.
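The INotifyPropertyChanged and serializable tips above combine into one pattern. A minimal sketch — the class and property names are placeholders of my own, not from the actual app:

```csharp
using System.ComponentModel;
using System.Runtime.Serialization;

// Sketch: a bindable, serializable data class for a WP7 app.
[DataContract]
public class StuffItem : INotifyPropertyChanged
{
    private string _name;

    [DataMember]
    public string Name
    {
        get { return _name; }
        set
        {
            if (_name == value) return;
            _name = value;
            // Raising PropertyChanged is what makes the bound UI update itself.
            OnPropertyChanged("Name");
        }
    }

    public event PropertyChangedEventHandler PropertyChanged;

    private void OnPropertyChanged(string propertyName)
    {
        var handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(propertyName));
    }
}
```

With this in place, a XAML element such as a TextBlock with Text="{Binding Name}" will refresh on its own whenever the property is set — no manual assignment to UI components needed.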

Well that’s for now. I will make follow up posts on things I learn during my ventures on WP7 world.

ps. Here are some screenshots of the app I am working on. It’s called “Stuff I Like” and it’s a simple tool for keeping track of stuff I hear about and might like.

I stumbled upon Red Gate Azure Backup service: https://cloudservices.red-gate.com/

After using the beta for a couple of weeks, I am really impressed. Everything is easy, the UI is intuitive and you don’t need lots of configuration to make this work. You can set up a basic daily backup job in a matter of minutes if you have an Azure Storage or Amazon S3 account ready; if not, it still does not take long. You spend most of the time digging up usernames and passwords, not figuring out how things work! Splendid.

So let’s go through how to do a daily Azure SQL backup into Amazon S3 storage.

Red-Gate Cloud Service Dashboard is really clean and simple. Just select the action you want to perform.

First, after registering with Red-Gate Cloud Services and logging in, select “Backup SQL Azure to Amazon S3” from the Dashboard.

Just fill in the Azure SQL credentials and Amazon S3 keys.

Secondly, fill in the Azure SQL server name (xxxxxxx.database.windows.net) and the login credentials for that DB, then press the refresh button next to the Database dropdown. Select the database you want to back up. Next, click the “AWS Security Credentials” link and log in to your Amazon AWS account. You will be taken directly to the place where you can find the keys.

"Access key id" goes to "AWS Access Key" and "Secret Access Key" is shown after pressing "show" link in Amazon AWS.

Note that you need an S3 bucket. If you don’t have one, you can create it here: https://console.aws.amazon.com/s3/home — I will not explain how that is done, but it is really easy. After you have inserted the Amazon keys, you can press the refresh button next to the Bucket dropdown in the Red-Gate UI. Then fill in the name you want your backup file to have. Don’t worry about the day stamp, because the tool will add the date to the end of the filename. Press the “Continue” button to select scheduling options, or back up now.

Just select when you want your daily backup to run or just press "Backup now" button.

Next, fill in the exact time you want your backup in 24h format, select the time zone, select the weekdays on which to run it, and press the Schedule button. You can also just press the Backup now button for a one-time backup. Note that there is also a monthly backup option.

You should see your scheduled backup on the next screen. When your first backup is done it appears here under the History header.

Next you are taken to the schedules and history view. Here you can view the details of the upcoming backup operation, cancel it, or invoke a backup right away. Just move your mouse over the upcoming event.

Just move the mouse over a scheduled backup job and you can cancel the job or invoke it to run right now.

When a backup job has completed successfully, you will receive an email and a “log” line will appear in “schedules and backup history”.

Whether the job succeeds or not, it will appear here under the History title

You can view the details of executed jobs by moving the mouse over a history line. If your backup job has run into errors, you will see an error triangle as well as receive the error message via email. The email also contains a direct link to that specific history log line. Neat!

If your backup job has exceptions, it gets a warning icon in the history. The Details link will contain an error message describing what went wrong.

Clicking the link in the email takes you to the same view as clicking “details” next to a completed job.

Here you can read through what happened while the backup was executing. If you ran into errors, they are also present. Just scroll down the log.

If your backup ran cleanly, you should see the created bacpac file in the Amazon S3 bucket.

The backup file is safe and sound in Amazon S3

Note that you can just as easily use Azure Storage, or use FTP to upload the backup to your own backup infrastructure.

There are a couple of missing features that I would like to see. For example, an option like the one in the Red-Gate backup tool to first create a copy of the database (for transactional safety), and the ability to encrypt the backup file before it is uploaded to Amazon or Azure storage. But even without these small features, this is still awesome!

Carefree clouding!

I just started to use this SQL Azure and SQL Server monitoring tool: http://www.cotega.com/ and I must say it is exactly what I was looking for in an easy-to-use monitoring tool. I don’t need a complex, feature-rich, massive monitoring solution. I only need an alert when the database is not accessible, when performance seems to be below average, or when there is a spike in usage. The service is still in beta, but it looks really promising.

Just click "add SQL Azure database"

Setting up the database connection was easy. After login you are presented with the Dashboard view. Just click “add SQL Azure database” and fill in the database address, username and password in the popup. Finally, press the “Add database” button.

Fill in the address and login details.

You can add notifications to your databases by going to the Notifications view and pressing the “Create Notification” button.

To add new notification just press "Create Notification"

Just fill in the name of the notification, select a monitoring rule from “select what you want to monitor”, and choose the database and the polling frequency.

Fill in the details and select monitoring target.

After you have selected the monitoring target, you can select when to create a notification. In this case I selected “when connection fails”. After this you can fill in an email address and even select a stored procedure to be run — though in the connection-fails case that does not make sense :). Finally, press “Add Notification”.

Add an email address and select a stored procedure if you need one

There you go. Just start waiting for those notifications to kick in.
There is a nice feature in the notifications: when you start to receive lots of them, for example concerning “connection fails”, you can temporarily disable the notification directly from the notification itself.

When you get many notifications of the same type, you have the option to disable the notification directly from the email.

You can check the logs for specific notification to make sure that rule works.

You can see how rules are being evaluated.

You can also view this info in a report view for performance analysis purposes.

It’s easier to spot performance problems from visualized reports.

A very promising, neat little monitoring tool for those who don’t need a complex solution to a simple problem!

What Azure SQL was missing was a properly supported way to make backups for disaster scenarios. Those scenarios include losing control over your Azure SQL server, or human error causing a SQL admin to delete the whole Azure SQL server. Great news, everyone: Azure SQL now has tools to mitigate the impact of these scenarios. The SQL Azure Import/Export Service CTP is now available. More details on how this works can be found here.

What do you need?

You need a means of scheduling the backup, an Azure Storage account to put the bacpac files in, a means of getting the exact URL of a bacpac file, and an Azure SQL account to back up.

Azure Account

You need one Azure account with Azure SQL and a Storage account, of course. 🙂

You might want to have a separate backup storage account because, if you cannot access your production account, you still have access to your backups. I personally download the bacpac backups from Azure to a local server.

Bacpac file export tool

In this example I will use the Red-Gate backup tool, because it allows you to easily make a copy of your database before the backup. This lets you make transactionally safe backups.

But you can also use the DAC SQL Azure Import Export Service Client V 1.2. With that tool you need to make the database copy yourself, using for example the Cerebrata cmdlets.

Azure Storage account browser

You can use any tool you like. In this example I will use Azure Storage Explorer because it is free 🙂

Windows server

Ideally this would be a dedicated Windows 2008 server, so that you can be sure it runs smoothly. You can also download the exported bacpac files to this server. Just to be safe 🙂

Backup using Red-Gate command line backup tool

Here is a nice how-to video on setting up scripts using the Red-Gate tool. NOTE that in the video the script makes a backup into a different database. What we want to do is schedule bacpac file generation. If you want to test how bacpac file generation works without the command line, watch this video.

But back to business: your scheduled script should look something like this:

RedGate.SQLAzureBackupCommandLine.exe /AzureServer:[url_to_azure_server] /AzureDatabase:[databasename] /AzureUserName:[db_owner_username] /AzurePassword:[password] /CreateCopy /StorageAccount:[Azure_account_name] /AccessKey:[primary_or_secondary_azure_storage_key] /Container:[container_name_in_storage] /Filename:[filename_of_bacpac]

Notice that I did not use any real values in the above script. Just fill in the parameters between [ ] and schedule the script to run as often as you like. Note that you need to change the file name on every run, because overwriting with the same file name does not work. I use date+time combinations.

So now we are ready for disasters 😉 Next, I will explain how to perform a restore to a totally new Azure SQL server.

Restore using Windows Azure management portal

When you need to restore a database to a new server, you need access to the bacpac backup file in an Azure storage account.

  • Firstly, create a new Azure SQL server. A how-to video is here.
  • Secondly — and this is important — add the same database logins that the backed-up database had. Azure SQL user management is explained in detail here.
  • Thirdly, get the URL of the bacpac file using Azure Storage Explorer. Open Azure Storage Explorer and log in to the storage account containing the bacpac file. Select the bacpac file and press the View button. This is also shown in the video in the next step.
  • Fourthly, you are ready to restore! It can be done as explained in this video.

If you have any comments or questions please don’t hesitate to ask.

Thanks for reading and happy backupping!
