Pandemonium

January 22, 2009

It’s not the size that counts

Filed under: Games Development, Games Industry, Personal, XNA — bittermanandy @ 10:37 pm

Indulge me as I walk down memory lane…

It was summer 2002 and I’d just landed my first professional job. (I’m still not sure how. I wasn’t very good back then. I like to think they saw that I’d become good, and I like to think I am good now. It might just have been luck!) I’d spent some time in the R&D department, to learn the ropes, become familiar with devkits (console hardware with added stuff to enable game development), and understand a bit about how the shared game engine worked; and then the time came to move on to a game team, of which there were five or six to choose from. The choice wasn’t freely given. Each team had different needs, so the question was asked: what did I want to work on? What did I want to specialise in?

I didn’t really know how to answer that. I was vaguely aware of the division between systems programming and gameplay programming, and knew that I preferred the former (I actually think the latter is better solved by a good data-driven system and a good designer, though I have worked with gameplay programmers who produced excellent results); but which system? And how would my choice affect my working day?

Games teams on “AAA” titles nowadays can easily have several tens of programmers. Clearly, if they all worked on whatever they fancied from day to day it would be a disaster. Each programmer therefore gets designated an area of responsibility. Generally speaking, the programming lead and the most senior programmers determine the overall architecture of the game early in development (or it may be mandated by the engine, particularly if it is middleware), and the programming team splits into several sub-teams, each led by a senior programmer who reports to the lead programmer. Examples of roles in the team (in no particular order) include:

Graphics: the celebrities of programming because they get to write code that produces awesome looking screenshots (or at least… code that lets the artists do so). Every time the publishers come for a visit, they’ll get led into the graphics programmers’ office and shown all the latest particle-laden explosions on flashy HDTVs. Graphics programmers spend a lot of time writing shaders, optimising the renderer, and talking with artists.
Networking: modern games are immensely complex, and with multiplayer online being practically compulsory nowadays, every game will have network specialists. They tend to spend all their time trying to teach other programmers how to write code that doesn’t break the online mode – by, say, sending 600KB packets every frame or updating something on the local client but not the game server. They usually look a bit stressed.
Physics: even with middleware like Havok (or in XNA, JiglibX and the like) available, physics remains one of the most complicated things in a game because it affects just about everything else. One game I worked on had five physics programmers.
AI: most games have enemies or some kind of non-player entity. While scripting and other designer-facing tools mean AI is not as hard-coded as it once was, someone’s got to write the code that interprets the scripts – that’s the job of the AI coder.
Audio: given that the only two ways your game can influence the player are via the screen and the speakers, audio is half of every game. Unfortunately it’s the second half (because it doesn’t look good in screenshots) and all too often audio is neglected. Done well, it can turn a good game into a mind-shatteringly atmospheric epic. Audio coders spend a lot of time talking to the musicians and SFX engineers, and they’re usually slightly bitter that the graphics programmers get all the plaudits (and flashy HDTVs).
Tools: there might be twenty programmers on a team, but there might be ten designers, fifty artists, and five audio engineers (as well as testers, producers, marketing, translators…). You can’t just give them a copy of Photoshop and Maya and tell them to get on with it. Every game needs specific tools that enable these people to get their assets into the game and tweak them until they’re fun, and in my experience, the better the tools, the better the finished game. Historically tools were DOS-based and unreliable; increasingly, they’re now written in C#/.NET and actually work more often than not. Tools programmers are the unsung heroes of the programming team. Their work is almost never seen by the public, but without them, the game itself won’t get seen by the public either.
Systems: asset loading. The game camera. Multithreading. Save games. Text, and menus. Achievements. DVD file layout. TCR compliance (rules that the console makers require you to obey before your game can be released). Support for steering wheels, dance mats, webcams and chatpads. The build process (putting together versions of the game to give to artists, management and testing). A veritable pot pourri of tasks that no game can go without. Some of these tasks will be given to the most senior programmers because they’re critical to the game’s success. Others will be given to the most junior programmers because they’re relatively self-contained and can be developed in isolation. Most don’t get noticed, until you try to make a game without them!

There’s more, but that’s a good initial summary. (Just think – when you write an XNA game, you’re responsible for all of the above! Luckily, XNA itself is brilliant at doing loads of it for you.) My first ever task on a game team was to write the game camera, and looking back I’m pretty proud of how it turned out. Very soon afterward I also took on the audio programming. Eventually, on that first game (I worked on it for three years – some were working on it for twice that), I was responsible for, or otherwise involved with, text and menus, localisation (making the game support other languages), asset loading, the build process, save games and TCR compliance. Later I’d work on asset optimisation and arranging files on the DVD, and probably some other stuff I’ve forgotten. So it became clear: my specialisation was that I was a Jack-of-all-trades. Hurrah!

With all these people working on different things, it’s critical that they don’t interfere with one another by overwriting each other’s changes. This is achieved by using a Source Control System. Basically, this keeps track of every code file and asset in the game, keeps records of how they change over time, and tries to ensure that if two people make changes at once, those changes are seamlessly merged together. Different teams use different products and approach this in different ways. Some examples, using the codenames of games I worked on:

Game Two: used CVS for source control. This is an horrific abomination that should be scourged from the earth. A coder would make all the changes he wanted, send out an email to the team saying “please don’t commit”, commit all the files he’d changed, and send another email saying “OK to commit”. Inevitably he’d have missed something, so the next coder to update would have to come and ask him to fix the problem before they could continue. Worst of all, the artists couldn’t get anything into the game without giving it to a programmer to commit it for them. This was frustrating for everyone involved and meant that changes to a level, for example, could take two or three days to get into the game. Putting together a build was an eight-hour manual process; all too often it wouldn’t start until 5pm the night before a deadline. I’m not completely sure how we managed to get the game finished, and I’m stunned that it turned out as well as it did despite everything. This is How Not To Do It.

Pocket/Pikelet: an improvement in every area, these teams used Source Depot (basically the same as Perforce) for source control. No more emails to control who could commit – a tool that lived in every developer’s System Tray would lock out commits while someone was going through the process of updating, building the game, running the unit tests and committing. A separate build machine would then automatically update to that version, run the tests again and, if they passed, make the build available to all the non-programmers – who, incidentally, had the tools available to do all their work without needing to go through a programmer. It was brilliant: broken builds and artist downtime were unheard of – the game was bulletproof throughout development. There was only one small problem. Updating, building, running unit tests and committing took half an hour or more. In a normal eight-hour day, only sixteen programmers could do it – at most. With twenty programmers on the team, there’s an obvious problem, and it got very frantic near deadlines – and leaving at least a day between commits meant that you’d commit too much at once, introducing bugs (which would fail the unit tests and delay your commit even longer). This was a problem, but in general this was the best system I’ve had the pleasure of working in.

Polished Turd (not the real codename, just what I call it): again this used Perforce for source control, except this time without a formal commit queue like Pocket/Pikelet. The commit queue had originally been introduced to stop CVS from breaking everything, but Perforce is so much better than CVS that things rarely break even if you’re free and easy with committing. It meant that you could fix a bug, commit your changes, fix another bug, commit your changes, and iterate very rapidly through your work. You’d update over lunch and overnight, or when you noticed someone had committed something you needed. If we’d only had the same team build system, unit tests, build suite, and artist tools from Pocket/Pikelet, this would have been the ideal way of working. (Unfortunately none of those things existed, so the game was unstable, the artists couldn’t do their job, and every deadline was a mad scramble to put together something vaguely playable that lasted more than five minutes between crashes.)

Now, most people reading this will be hobbyists working in XNA on their own. At the moment, that applies to me too. However, I’m uncomfortably aware that the big blue blobs and soulless grey polygons that currently represent the actors and props in my game won’t inspire other people to play it – and I’m closer to being autistic than artistic. So sooner or later, I’m going to need other team members to make models, textures, animations and sound for me. (Any volunteers?) And, while I’m currently planning to save all the coding duties for myself (with the XNA Framework itself and the multitude of excellent community libraries out there, this is a realistic proposition in a non-trivial game for really the first time this side of about 1998), many of you will no doubt be thinking of forming small coding teams, to split the work among you. (I don’t blame you. Just look at the list above. Even a simple game has a lot to do…) So – how well does XNA support medium-to-large game teams? And what would such a team require?

Source control is absolutely essential the moment your team grows larger than one. And, with a team of one or two, there’s only one choice – Perforce, which is free for up to two users, and so good I use it even though no-one’s forcing me to. Unfortunately, as soon as your team size hits three people, it’s something like $700 a license. Ouch. Worth absolutely every penny and more if you’re a pro developer with your outgoings covered by a publisher, but out of reach of everyone else. I am told that Subversion is good for a free product, but I look at the lack of atomic changelists and cringe. Probably the next best if you can’t afford Perforce though. There’s nothing here that’s different for XNA than if you were using C++ or any other language.

It will be essential to have a configuration of your game that allows your artists to put their assets into the game without waiting for you to do it for them. The XNA content pipeline is a wonderful, wonderful thing of great beauty, but by default it is tied into Visual Studio – the content pipeline build occurs just before you run your game, and that’s it. An artist doesn’t want Visual Studio, and he doesn’t want to restart the game every time he changes a texture – he should be able to save off the texture file and see it change in-game. So you need to provide them with a version of the game that runs on its own (perhaps in a WinForm) and a tool that hooks into the content pipeline, lets them build assets while the game is running, and notifies the game, which then reloads that asset. This is pretty easy for assets that already exist in a content project, but the tool will need to support adding to the content project (hidden from the artist) and even creating new ones. All this needs to hook into your source control system without your artist needing to learn to type “p4 -d -a -q -z” or whatever.
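
To make that a bit more concrete, here’s roughly the game-side half of the idea as a minimal sketch. It assumes something – the artist-facing tool, or a watched post-build step – drops rebuilt .xnb files into the folder the game loads content from. HotContent is a name I’ve made up for this example (ContentManager and FileSystemWatcher are real framework types), and throwing the whole ContentManager away is the bluntest possible reload strategy:

using System;
using System.IO;
using Microsoft.Xna.Framework.Content;

// Watches the built content folder and swaps in a fresh ContentManager when any
// .xnb file changes. ContentManager caches (and owns) everything it loads, so the
// simplest reload strategy is to throw the whole manager away and let the game
// re-fetch its assets. "HotContent" is my own name for this sketch, not an XNA type.
public class HotContent
{
    readonly IServiceProvider services;
    readonly string rootDirectory;
    readonly FileSystemWatcher watcher;
    ContentManager content;
    volatile bool dirty;

    public HotContent(IServiceProvider services, string rootDirectory)
    {
        this.services = services;
        this.rootDirectory = rootDirectory;
        content = new ContentManager(services, rootDirectory);

        watcher = new FileSystemWatcher(rootDirectory, "*.xnb");
        watcher.IncludeSubdirectories = true;
        watcher.Changed += delegate { dirty = true; };   // raised on a worker thread
        watcher.Created += delegate { dirty = true; };
        watcher.EnableRaisingEvents = true;
    }

    // Call once per frame from Game.Update. Returns true when the game should
    // re-fetch its assets (everything the old manager handed out has been disposed).
    public bool Refresh()
    {
        if (!dirty)
            return false;
        dirty = false;

        content.Unload();
        content = new ContentManager(services, rootDirectory);
        return true;
    }

    public T Load<T>(string assetName)
    {
        return content.Load<T>(assetName);
    }
}

The game calls Refresh once per frame and, whenever it returns true, re-fetches whatever it’s using (playerModel = hotContent.Load<Model>("Models/Player") and so on). Crude, but it takes an artist from “save in Photoshop” to “see it in the game” without restarting anything; the harder half is the artist-facing tool that rebuilds the .xnb in the first place.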

The content pipeline causes a few other problems too. (Though don’t think for a second that I’m knocking it. What it does is amazing.) Big games tend to generate thousands, even hundreds of thousands of assets – and in the content pipeline, each of those is a different file. People often don’t realise that opening a file, even from a hard drive, can sometimes take as long as a quarter of a second. That doesn’t sound like much until you multiply it by a hundred thousand. (You can very easily see this for yourself by using WinZip to zip up one large file of, say, 10MB, then comparing it to zipping up a thousand files of 10KB each – I guarantee the latter will be much slower.) To give an XNA-related example: Kameo, Pocket and Pikelet all used XACT, the same audio tool as provided with XNA (and which I’ve used very extensively). Pocket, in particular, had well over 200 soundbanks which all needed loading at startup. Even though they were only between 1 and 4KB each, this took a long time to load from the DVD – so long that the 20-second load time mandated by Xbox 360 TCRs seemed an impossibility. The fix was pretty simple: we altered the build process to package all those soundbanks into a single file, loaded that file into memory, and loaded the soundbanks from that memory. Instantly the problem was solved. The same solution won’t work in XNA: there’s no method to load soundbanks from memory. It expects each soundbank as a separate file, and that’s that. A big game in XNA would really struggle under such restrictions. (Loading on a background thread won’t help – 20 seconds of disk access is 20 seconds of disk access – though admittedly an XNA game would be on a hard drive, not a DVD, which helps a lot.) You’d have to plan ahead very carefully – in terms of the game design, not just code; which is hard, because designers don’t understand arbitrary restrictions from code, and nor should they have to – to ensure that there was never a need to load so many individual files all at once.
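
To make the “big file” trick concrete, here’s a toy sketch. The format is entirely made up for illustration – a real system would add alignment, compression, hashing and so on – and, as I said above, it only helps where the loading API will take a stream or a byte array rather than insisting on a filename:

using System;
using System.Collections.Generic;
using System.IO;

// A toy "big file" format: a count, then for each packed file its name, length
// and raw bytes, back to back. One disk open and one sequential read replaces
// thousands of tiny opens. The format is invented for this sketch.
public static class BigFile
{
    public static void Pack(string outputPath, IEnumerable<string> inputFiles)
    {
        using (var writer = new BinaryWriter(File.Create(outputPath)))
        {
            var files = new List<string>(inputFiles);
            writer.Write(files.Count);
            foreach (string file in files)
            {
                byte[] bytes = File.ReadAllBytes(file);
                writer.Write(Path.GetFileName(file)); // length-prefixed string
                writer.Write(bytes.Length);
                writer.Write(bytes);
            }
        }
    }

    // Load the whole archive in one go, then hand out the individual files
    // (soundbanks, textures, whatever) as in-memory byte arrays.
    public static Dictionary<string, byte[]> LoadAll(string path)
    {
        var result = new Dictionary<string, byte[]>();
        using (var reader = new BinaryReader(new MemoryStream(File.ReadAllBytes(path))))
        {
            int count = reader.ReadInt32();
            for (int i = 0; i < count; i++)
            {
                string name = reader.ReadString();
                int length = reader.ReadInt32();
                result[name] = reader.ReadBytes(length);
            }
        }
        return result;
    }
}

Pack runs as part of the build process; LoadAll does one open and one big sequential read at runtime, which is exactly where the one-file-per-asset version falls over.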

With these caveats in mind I’m convinced a medium-to-large game can realistically be made in XNA. The “poor performance” of C# compared to C++ is for the most part a myth: getting the absolute 100% optimum out of the hardware might require C++, but you could get at least 95% of the way there with C#, and 98% of all games don’t need 100% performance. It’s unfortunate that the biggest win for hobbyists and small-team endeavours – the brilliant content pipeline – appears to be the biggest limiting factor for large teams. The issue of when and how assets are built, and how they can be rapidly iterated without restarting the game, is definitely solvable, though with a fair bit of coding effort; the fact that so much of the pipeline relies on a one-asset, one-file relationship is trickier. A hundred-thousand-asset game would need a hundred thousand files, and loading times would be horrific.

So really large games might not be easy in XNA. Happily (?) few large games nowadays are single platform, and since XNA is not available on PS3 and Gamecube, I don’t think many large teams will be evaluating XNA anyway. Medium teams, independents, and small hobbyist teams can definitely have a field day with it (though profitability of XBLCG at least has yet to be proven). Just remember that the limiting factor for such games is content, and put a lot of effort into making your artists’ lives easy. If they can make a model, put it in your game, play with it, tweak it, play with it again, tweak it, play with it, and check it into source control, they’ll beg to be allowed to make more. If they have to send you the model and wait two days for you to send them a build with it in, they’ll give up within a week. Frankly – even if you’re working on your own, and doing all the code and art yourself, a little bit of work to make good tools will make your own life easier; and isn’t that what we all want?

14 Comments »

  1. Nice article. This is exactly what I always wanted to know – how it looks from “the inside”. Thanks 🙂

    Comment by Ginie — January 22, 2009 @ 11:02 pm | Reply

  2. Great post.

    The XNA content management system is a true nightmare, and I really wish it hadn’t been hooked up as a property on the Game class. The first thing I did was rip it out and program my own!

    There are a lot of “kludges” within XNA in regards to content – specifically the dreaded “internal” constructors that make things like SpriteFonts impossible to use without the XNA content system. I really wish the XNA team would “see the light” in regards to indie and medium-sized projects and make the core classes a bit more versatile in that regard.

    Comment by ShadowChaser — January 23, 2009 @ 12:33 am | Reply

  3. With regards to the Content Pipeline: first of all, XnaZip provides a simple library to have the Content Pipeline load from Zip files rather than collections of .xnb files. It’s 360-compatible with a third-party add-on:

    http://www.codeplex.com/XnaZip

    Secondly, I think it’s only a matter of time before someone makes a Content Pipeline tool targeted at artists (that is, not embedded in Visual Studio), possibly even Microsoft themselves if enough customers demand it as a high priority. I’m not sure if you are familiar with the Expression suite, but one of the tools is Expression Blend and it is an obvious comparison. Blend opens an MSBuild file (.csproj, .vbproj, etc.) and lets a designer edit the XAML files in it (or add new XAML files) without ever touching Visual Studio, in a more “Photoshop-like” or “Flash-like” environment tuned to a designer’s role. I certainly could see such tools eventually being built for XNA. (It shouldn’t even be terribly hard… MSBuild has a GACed .NET API that is pretty simple to write against, from what I hear.)

    As for source control: “Source control is absolutely essential the moment your team grows larger than one.” I think it’s essential even for teams of one, and that everyone should find a source control system that they are happy with. I suggest trying a distributed version control system (DVCS). DVCSs are the new state of the art, and all of the good ones are open source and free. DVCSs generally remove any and all issues of “locking” or “commit management” in the sense of “don’t commit yet” or “don’t commit to file x unless it is a full moon and a Thursday, unless that Thursday happens to fall on a leap year in which case you can’t even commit during business hours”. Mercurial is pretty cool, Bazaar has its following, and Git is stupid but has a huge growing popularity and is used by more and more very large projects (Linux kernel, Perl, others…). Personally, I prefer Darcs, which has the smartest merging and nicest user interface of just about any source control system ever… All of them should be reasonably quick to pick up and learn, and I think even dumb old Git is a leap above traditional source control systems like SVN or even Perforce in terms of enabling strong collaboration and allowing developers to save their progress as they make it.

    Comment by Max Battcher — January 23, 2009 @ 8:48 am | Reply

  4. “If they can make a model, put it in your game, play with it, tweak it, play with it again, tweak it, play with it, and check it into source control, they’ll beg to be allowed to make more. If they have to send you the model and wait two days for you to send them a build with it in, they’ll give up within a week.”

    Exactly what happened here at the Uni. I found some good artists for the Department of Digital Arts and made an attempt to create a game with XNA Game Studio. It was a small effort, with about a dozen assets (models). The largest bottleneck in the process was, predictably, the art pipeline. The artists had to send me the Maya files, I had to convert them to .fbx, import them into my ModelViewer, grab screenshots and send them back! Granted, I could give them the ModelViewer, but I would have to install XNA Game Studio on their PCs as well. I shouldn’t have to do that; they are users, not programmers!

    I believe that managed C#-type solutions are most probably the future of videogame development. Processors are becoming so fast that it won’t matter anymore that you can’t achieve 100% processor utilisation. Hassle-free memory management, easy threading, and easy content management are key to successful videogame development.

    XNA points in the right direction; it’s not quite there yet, though!

    Comment by thinkinggamer — January 23, 2009 @ 9:07 am | Reply

  5. Good to see you back and writing 🙂

    I didn’t realise Perforce did a free edition for personal use. I may have to look into that. I use it daily in a 100+ user environment. I’ve been thinking I need a source-control system for my own personal endeavours, so given I’m already familiar with it, it might be worth a shot. How do you find the VS integration? I find it is good for general check-out/edit/check-in use, but for things like renaming it gets very tricky.

    Comment by JJ — January 23, 2009 @ 12:14 pm | Reply

  6. The tools programmer comment is spot on! I do feel (at times) like the unsung hero you speak of (working in a large game company)… The part about small files taking lots of time to work with is also spot on; that’s why we’re using a system based on “Big Files” (one huge file using its own file system/FAT, so when you update a bunch of files it’s all done in memory, basically), but this causes a bunch of other problems (you have to write your own accessors between your SCM and your data…). I guess this could be an article in itself (ways of structuring your game assets).

    Anyhoo, I love reading your articles – great insight into the industry, and they might even motivate me to pick up XNA for real! Thought I’d share so you can continue posting your musings!

    Cheers

    Comment by Nick — January 23, 2009 @ 3:55 pm | Reply

  7. Source control is an amazing tool to save your ass from yourself when you’re a team of one. It’s also a fantastic source of automatic backup! I personally recommend Fortress (or Vault) from SourceGear. It’s free for a single user. It’s easy to configure, well-integrated into Visual Studio, and it’s never let me down. You can even access your code over the internet if you choose to.

    Comment by Josh Usovsky — January 23, 2009 @ 8:19 pm | Reply

  8. I am constantly amazed at the number of students I come across who have not used source control, and that universities don’t teach it.

    Thanks for another insightful blog!

    Comment by paulecoyote — January 26, 2009 @ 11:31 am | Reply

  9. When you make commits and builds using a build server, is this the only allowed way to compile code, meaning that the programmers have to make code changes, then wait 30 minutes for the whole thing to compile, test and run?

    Or does it only do the build and test process when a programmer commits their changes to the rest of the team?

    Again, thanks for your often very interesting blog entries. I’m a computer science undergrad and you’re giving me a better idea of what the games industry is like on the inside!

    Could I request a post about World/Entity/Component systems, or more generally how big games are structured so that the code doesn’t turn into a disorganised, heavily interconnected mess?

    Comment by zander — January 30, 2009 @ 5:28 pm | Reply

    • 30 minutes to compile, test and run would not be very efficient, and I’m all in favour of efficient development, remember? 😀

      No… how it works is, there is a central store (“repository”) for all the source code, source assets, etc. This is somewhere on a network server (or even on the internet!), usually with an automatic backup running. When a new team member joins, after installing Visual Studio etc., they initially “update” their view of the project. This copies everything that’s in the repository onto the user’s local machine. From that point on, you can do whatever you want to what’s on your local machine – change it, compile it, test it, whatever – without affecting anyone else. When you change a file (say a .cs file), it “checks it out” – that is, tells the server that you are editing the file. As the server knows who has checked out what, other people can see who is working on which files, and if necessary particular files can be “locked” from being checked out by anyone else. (Note: not all source control systems support checking out; some just let you change stuff at will. The good ones support it – checking out has many benefits.)

      Once you have reached the stage where you have made a set of changes, and tested them, and think they are ready for the rest of the team to use, you “check-in” (or “commit”) your changes. (If someone else has checked-in any of the same files since you last updated, you’ll need to update your files first, and “merge” them – which may result in a “conflict” if you’ve both changed the same part of the file). Checking-in takes a copy from your local view, and puts it in the repository. Each check-in of each file is marked with a file “version” number, and the differences (“deltas”) between versions are stored in the repository too – so, I can see that on the 13th of November, I checked-in version 6 of Pandemonium.cs, and the change I made was to fix a bug where I’d missed #if WINDOWS around some Windows-only code.

      Once your check-in is complete, other team members can update, if they wish. If it’s an urgent fix, they may all update straight away. If it’s a non-urgent but still important fix, they may update over lunch or overnight. If they’re in the middle of making significant changes, they may not update at all until right before they’re about to check-in themselves. So, everyone gets to work in isolation for as long as they want to, but are able to access everyone else’s changes whenever they want to, too.

      The trick with the build machine is that it will attach a service to the source control server, and every time it detects a check-in, it will immediately update, build, and test all configurations of the game. (Overnight, it will rebuild instead of building, to make sure nothing is stale, and process the big things like levels that can take hours to convert.) If anything goes wrong (e.g. the code doesn’t build or a test fails), it sends an alert to the team saying “Andy broke the build, don’t update yet! Andy, please fix it immediately!” On the other hand, if everything passes, it marks the check-in as a good one, applies a “label” (a snapshot of all versions of all files), stores a copy of the last few good builds somewhere the artists and designers can get to them, and sends them an alert saying “new build available”. This way, the artists and designers are protected from bad changes to the code, and giving them access to any of the last few versions protects them even from bugs that pass all the tests but still cause problems. It also means that at any time there is a working version of the game you can give to publishers, producers, E3 demos, surprise journalist visits etc. – you never have to put the whole team on freeze while a demo version is built.
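
      In case it helps to see the shape of it, here’s a bare-bones sketch of the loop at the heart of such a build machine. ISourceControl, MyGame.sln and RunUnitTests.exe are placeholders I’ve invented for the example – in reality you’d wrap your Perforce or Subversion client and your own test runner behind them, and probably trigger off commits rather than poll:

      using System;
      using System.Diagnostics;
      using System.Threading;

      // Placeholder interface over whatever source control you use (Perforce,
      // Subversion...). These members are invented for the sketch.
      public interface ISourceControl
      {
          int GetLatestChangeNumber();
          void SyncTo(int changeNumber);
          string GetAuthor(int changeNumber);
      }

      // The heart of a continuous build machine: notice a new check-in, sync,
      // build, test, and either publish the build or shout at the culprit.
      public class BuildMachine
      {
          readonly ISourceControl sourceControl;
          int lastGoodChange;

          public BuildMachine(ISourceControl sourceControl)
          {
              this.sourceControl = sourceControl;
          }

          public void Run()
          {
              while (true)
              {
                  int latest = sourceControl.GetLatestChangeNumber();
                  if (latest > lastGoodChange)
                  {
                      sourceControl.SyncTo(latest);

                      // Any non-zero exit code counts as a broken build.
                      bool ok = RunProcess("msbuild", "MyGame.sln /p:Configuration=Release")
                             && RunProcess(@"Build\RunUnitTests.exe", "");

                      if (ok)
                      {
                          lastGoodChange = latest;
                          // Real system: apply a label, copy the binaries somewhere
                          // the artists and designers can get at them, announce it.
                      }
                      else
                      {
                          string culprit = sourceControl.GetAuthor(latest);
                          Console.WriteLine("{0} broke the build at change {1}!", culprit, latest);
                          // Real system: email/IM the team and don't publish the build.
                      }
                  }
                  Thread.Sleep(TimeSpan.FromSeconds(30)); // or hook a commit trigger instead of polling
              }
          }

          static bool RunProcess(string fileName, string arguments)
          {
              using (Process process = Process.Start(fileName, arguments))
              {
                  process.WaitForExit();
                  return process.ExitCode == 0;
              }
          }
      }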

      I could write more – about changelists, and branches, and private branches, and how to split a repository into workspaces for different teams, and more stuff you can do with the build machine, and so on – but that should give you a pretty good idea. If you’re interested, Google products like Perforce, Subversion, or CVS (from best to worst; there are many others too).

      Also in case I didn’t make it clear, and as Josh already pointed out, source control is even useful if you’re in a team of one. I use Perforce, and there’s no-one else in my team. It means that if you find a bug, you can see when the bug was inserted into the code, what else you changed at the time, why the change was made, and – if all else fails – you can undo the change by “rolling back” to a previous version. It’s pretty much invaluable and, like Paul, I’m amazed they don’t cover it at University.

      Comment by bittermanandy — January 30, 2009 @ 7:53 pm | Reply

  10. Subversion has changelists in the latest versions (1.5+), but I’m not sure what you mean by “atomic changelists”. All commits are atomic in Subversion, and changelists are simply a way to organize commits (keep several sets of changes inside the same working copy).

    Subversion is fine for what I’d call medium-sized projects – we have a repository of ~150 GB spanning several projects, with about 20 GB of art assets per project, with an art team of ~30. Works pretty well.

    Comment by Ivan-Assen — March 14, 2009 @ 3:12 pm | Reply

    • An atomic changelist is one where a whole bunch of files are committed together. The commit can be undone, but if so, it applies to all files in the change list. This way you can’t ever get half an update… single-file atomic commits provide no protection!

      Comment by bittermanandy — March 14, 2009 @ 7:45 pm | Reply

  11. Subversion has this, but they call them “atomic commits” – it was one of the major design points, “CVS with atomic commits”. If you commit a.cpp and b.cpp together, no one ever will see only one of them changed, no matter when they update or revert.

    Comment by Ivan-Assen — March 16, 2009 @ 3:02 pm | Reply

  12. Yes, I think you may have confused CVS and SVN there. SVN definitely does multi-file atomic commits. After a hard drive crash taught me a lesson, I’m using xp-dev.com as a distributed SVN host (I have a GB for free there, although they’re removing free SSL access soon, which is a shame). I also make sure to use my free mozy.com account to backup my local sourcecode.

    There’s no excuse not to use a source control solution, especially when you can even get Visual Studio integration now (I use AnkhSVN). The best thing is that everything is either free or very cheap. I look back on things and realise how fortunate we are to have this level of support, along with XNA GS itself and the Framework. We’ve come a long way from Net Yaroze and dodgy SDKs.

    I really enjoyed reading your article and others here. I hope you’ll keep sharing your knowledge with the rest of us and I look forward to the rest of your camera-focused series.

    Comment by Dave R. — May 6, 2009 @ 5:46 pm | Reply

