This page is intended to be a sort of liner notes for my contributions to Spore. It's a place for me to write up miscellaneous development comments about the parts of the game I worked on, while they're still fresh in my mind. I think the game had over 80 people working on it towards the end, and it was in development for more than 5 years, so basically everything in the game was touched by more than one person and was a team effort. Given that, I will strive for inclusion and accuracy, and I will only talk about systems to which I made substantial contributions. If I've left somebody out or made a mistake, I apologize; please email me and I'll correct it immediately.
A Brief Note on Game Credits This page could be seen as augmenting the game credits, like what you'd get if you could click on the names to get more information. I think credits are very important for game developers, our industry, and the art form. Although assigning credit is a bit tricky because large-scale game development has a lot of subtle, overlapping, and often blurry responsibilities, I still think it's interesting to have rough descriptions and color commentary of the major things each person worked on. This is my attempt at this for my contributions, before I forget the details. I hope more developers put up pages like this for the games they work on.
Almost all of my contributions center around the creatures, and helping to bring them to life for players. My favorite moment in the game is when you first attach a leg or arm or mouth to the torso in the creature editor, and your creature comes alive and turns to look at its new limb or roars with its mouth. It's great to watch players be delighted by something they just created.
In most cases I'm limiting my discussions to the specific code and tools that I contributed to directly, which leaves out descriptions of a lot of the incredibly important and groundbreaking work other people did on neighboring code, whether higher level, more player facing aspects of the code, like user interface and gameplay, or lower level systems stuff like resource management, threading, rendering, and the like. It also leaves out the contributions of the awesome artists who used the [sometimes cantankerous] tools and systems.
These sections are in rough chronological order of my contributions.
Creature Skin Mesh
The first thing I worked on when I started on Spore (in October, 2003) was the creature skin. Unlike games with fixed characters, like a James Bond or Lara Croft game, or games with parameterizable meshes, like the Sims or City of Heroes, Spore has to generate the entire mesh on the fly as the player makes the creature. While a typical game's fixed or parameterizable character mesh might be worked on for days in a 3D modeling tool like Maya by a professional game artist, Spore needs to regenerate the skin in real-time as the player deforms the torso and attaches and detaches limbs. I chose a blobby implicit surface (sometimes called metaballs) to represent the skin.
Implicit surfaces are interesting because they are very topologically robust, yet they lack "local control". This lack of control compared to modeling with triangles directly (called a boundary representation, or b-rep) is one of the reasons implicit surfaces haven't really ever gained traction for regular 3D modeling—when you're making James Bond, you want to get his nose just right. However, the topological robustness is actually more valuable to Spore than the local control, since we have a higher-level user interface for creature creation. We don't let the player manipulate at the polygon level anyway, both because we would end up with an interface as complex as Maya's, and because we need to keep the "recipe" for the creature very small so we can transmit it over the wire to and from the Sporepedia. The deltas for per-polygon editing get big, fast. Robustness, on the other hand, is vitally important, because we let the player have so much control over the creature morphology, and the skin always needs to flow smoothly over the skeleton of the creature.
Here are some images of the first "creature" I built with the metaball skin system, showing the skin, wireframe, the metaballs, the voxelization, and a 3D print of the creature.
The incredibly awesome Henry Goffin took over this code after he joined the team and did almost all of the hard work of getting the skinning system into the creature editor and game.
Some interesting details:
- We use a 4th order polynomial in the squared distance from the sample point to the center of the given metaball for the implicit surface, similar to Triquet, Meseure, and Chaillou. They use a 2nd order polynomial, but we square the main term again to get more continuous derivatives to avoid lighting discontinuities. The actual equation is:
ci is the metaball center position, Ri is the metaball radius (the function is defined to be 0 outside this radius), and si is the scale factor for the metaball, affecting its goopiness.
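The equation itself appeared as an image in the original page; reconstructed from the description above (Triquet et al.'s 2nd-order term in the squared distance, squared again to give a 4th-order polynomial), it presumably has the form:

```latex
f(\mathbf{p}) \;=\; \sum_i s_i \left( 1 - \frac{\lVert \mathbf{p} - \mathbf{c}_i \rVert^2}{R_i^2} \right)^{4}
\qquad \text{for } \lVert \mathbf{p} - \mathbf{c}_i \rVert < R_i, \text{ and } 0 \text{ otherwise,}
```

with the skin extracted as a level set of the summed field f.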
- When I started the skin work, the Marching Cubes patent was still in effect, so to work around the patent I tessellate the surface into triangles using an ear clipping algorithm. The patent expired before we shipped, but I never went back and tested whether using Marching Cubes would have been faster or slower.
- We did not have time to implement metaball groups, which means the skin is one big implicit surface. As a result, the skin on limbs can join together, forming webbing between the limbs when they move close to each other. We intended to fix this, but players have used it as a feature to create bat wings and other special effects. Since players found it valuable, and it's easy to work around while editing by simply moving the limbs apart if you don't like the effect, we probably won't change the current behavior. I sum up this slightly odd chain of events with:
Bugs + Player Creativity = Features

Random tidbit: some people internally called this the "flying squirrel bug".
- A lot of naive implicit surface tessellation algorithms generate poor quality triangle meshes with lots of slivers. We avoid this by using the amazingly awesome technique from Compact Isocontours from Sampled Data, Moore and Warren, in Graphics Gems III. There is a tech report version of this available online: Mesh Displacement: An Improved Contouring Method for Trivariate Data. My friend Casey Muratori introduced me to this paper a while back, and it is the secret to high quality implicit surface tessellations. Few people seem to know about this technique, but it's trivial to implement and generates perfectly uniform meshes as you can see in the screenshots above—highly recommended!
- We only use spherical metaballs, and we distribute them along the limbs and torso using a neat bit of math that calculates how close they need to be to form a smooth shape based on the implicit surface parameters. I'll write this up at some point if people are interested. Ellipsoidal metaballs would be cool and allow a wider range of shapes, but they seem like they'd be significantly slower to evaluate because they're orientation dependent, and evaluation speed is crucial to keeping the creature editor snappy.
- We generate bone weights for the vertices based on which body parts generated which metaballs. This works well for limbs, but sometimes big spine segments don't generate smooth weights, and the torso on fat creatures can shear. Fixing the torso skin vertex weights is relatively easy, and Henry and I prototyped a solution right before ship, but dealing with any torso-attached parts is harder, so we had to punt. We hope to fix this at some point when we work out the details.
- For the "parts" of the creature, we use pre-authored Maya models called Rigblocks, with lots of parameterized deformation handles to change the shapes. Andrew Willmott presented a SIGGRAPH 2007 Technical Sketch on them, and Art Director Ocean Quigley put up a post about the earliest mockups for rigblocks.
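Since the ear-clipping workaround above is only mentioned in passing, here's a minimal 2D sketch of the general technique. This is my own illustrative code, not Spore's (the real tessellator works on the polygons produced by contouring the voxelized implicit surface):

```python
def cross(o, a, b):
    """2D cross product of (a - o) and (b - o); positive for a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def point_in_tri(p, a, b, c):
    """True if p lies inside or on the boundary of triangle abc."""
    d1, d2, d3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

def ear_clip(poly):
    """Triangulate a simple CCW polygon (list of (x, y)) by ear clipping.

    Returns a list of index triples into poly.
    """
    verts = list(range(len(poly)))
    tris = []
    while len(verts) > 3:
        clipped = False
        for i in range(len(verts)):
            p, c, n = verts[i - 1], verts[i], verts[(i + 1) % len(verts)]
            a, b, cc = poly[p], poly[c], poly[n]
            if cross(a, b, cc) <= 0:
                continue  # reflex corner, can't be an ear
            if any(point_in_tri(poly[v], a, b, cc)
                   for v in verts if v not in (p, c, n)):
                continue  # another vertex intrudes, not an ear
            tris.append((p, c, n))  # clip the ear off
            verts.pop(i)
            clipped = True
            break
        if not clipped:
            raise ValueError("polygon is not simple/CCW")
    tris.append(tuple(verts))
    return tris
```

Any simple polygon with n vertices yields exactly n - 2 triangles this way, which is what makes the method easy to reason about compared to the patented Marching Cubes table lookup.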
Creature Texture Painting, a.k.a. Skin Paint
Again, contrasting Spore with games that use fixed character assets illuminates the various challenges we faced when texturing the creatures.
All of this skin paint development was done in tight collaboration with Spore's Art Director Ocean Quigley. We worked closely to figure out which degrees of freedom we needed to expose to give the creatures convincing organic skin, and how we'd fit it onto existing 3D hardware. Ocean has a detailed post about the skin paint system on his blog, and one of Andrew's SIGGRAPH 2007 Technical Sketches discusses skin paint as well.
Texture Charting As mentioned above, for a normal game, the character mesh is fixed early in the process, and that mesh is then charted (also called "uv'd", in reference to the use of 'u' and 'v' for texture coordinates) to create texture coordinates for all the triangles in the mesh. The final set of texture coordinates is called a texture atlas. Intuitively, we flay the 3D mesh apart, like skinning an animal, until it lies flat on a plane. There are mathematical theorems about the difficulty of doing this with minimal distortion, and the plethora of map projections shows there's no "best" way to do it, so artists adjust and tune the mapping while the character is being created until they're happy with the result. This might take a few hours or a day.
By contrast, Spore needs to create a texture atlas almost instantly, because the player can modify the mesh and then go into the editor's Play Mode or Paint Mode and see the results immediately. I wrote a very simple but optimized charter that does a really crappy job compared to a professional artist, but it does the job in 10 milliseconds on our minspec platform. Here's an example of an atlas produced by the charter:
Notice all the individual triangles at the bottom, and the wasted space. You can probably determine the algorithm if you look closely at the two images. The code starts on a random uncharted triangle, then floods outward as it finds neighboring triangles facing in approximately the same cardinal direction in 3D (meaning down the +x, -x, +y, -y, +z, or -z axes). When it can't find any more uncharted triangles attached to the current group and facing the same direction, it projects the group flat onto the plane of the current axis. This guarantees bounded distortion. After charting all triangles, it packs the charts into the texture atlas by sorting them by bounding box size, and then packing the bounding boxes of the charts in 2D texture space in a zig zag manner.
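The flood-fill portion of the charter described above can be sketched roughly like this. This is a Python simplification with names of my own invention; the real code also projects each chart onto its axis plane and packs the resulting bounding boxes into the atlas:

```python
import numpy as np

# The six cardinal directions: +x, -x, +y, -y, +z, -z
AXES = np.array([[1, 0, 0], [-1, 0, 0],
                 [0, 1, 0], [0, -1, 0],
                 [0, 0, 1], [0, 0, -1]])

def dominant_axis(normal):
    """Index of the cardinal direction the face normal most points toward."""
    return int(np.argmax(AXES @ normal))

def build_charts(tri_normals, tri_neighbors):
    """Greedy flood fill: group edge-connected triangles that share a
    dominant axis, so each chart can be projected flat with bounded distortion.

    tri_normals:   per-triangle unit face normals
    tri_neighbors: per-triangle lists of edge-adjacent triangle indices
    Returns a list of charts, each a list of triangle indices.
    """
    axis = [dominant_axis(n) for n in tri_normals]
    charted = [False] * len(axis)
    charts = []
    for seed in range(len(axis)):
        if charted[seed]:
            continue
        chart, stack = [], [seed]
        charted[seed] = True
        while stack:
            t = stack.pop()
            chart.append(t)
            for nb in tri_neighbors[t]:
                if not charted[nb] and axis[nb] == axis[seed]:
                    charted[nb] = True
                    stack.append(nb)
        charts.append(chart)
    return charts
```

Because every triangle in a chart faces within 90 degrees of the chart's axis, the orthographic projection onto that axis plane can't flip or degenerate any triangle, which is the source of the bounded-distortion guarantee mentioned above.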
Contrast the frog above with this Oliphant texture from Battle For Middle Earth:
This texture atlas is incredibly well packed and distributed; all of the texels are used with no wasted space. Ocean once quipped that he'd fire any artist who made a texture atlas as bad as that frog's map. I replied that it was indeed a poor quality atlas, but it was maybe only 10 or 100 times worse than a human might make, and it was created about 36,000 times faster, so my code is still ahead by a factor of 400 or so! Plus, it's quite difficult to squeeze a professional texture artist into each shipping game box. That said, the poor packing does have some impact on the final result, because wasted texels mean less apparent texture resolution for a given amount of texture memory used, and lots of texture seams mean both slightly bigger meshes (from the lack of texture coordinate sharing) and the potential for visible seams on the skin. It turns out we didn't need the charter to be quite as fast as it currently is, so Henry and I mused about how to trade off a little bit of speed for better charting and packing, but it was unclear how to do this in the development time we had left. The high end techniques in most graphics research papers about texture charting take far too long for our use case, usually on the order of seconds or minutes.
The key difference between a Spore creature's texture atlas and a fixed game character's atlas is a human artist will paint into the latter, so it has to be made from recognizable pieces and it has to be coherent; individually charted triangles make no sense when you're painting into a map directly in Photoshop. Because Spore uses a procedural paint system, only the computer ever needs to paint into the texture maps for Spore creatures, so as long as the code can figure out where things belong, it's okay.
Lowlevel Texture Painting At the lowest level, the paint system renders multi-channel brush textures into the creature's 2D texture map. The initial prototype for the skin paint system was a "3D paint" system, allowing artists to use the mouse or graphics tablet to paint directly onto the creature mesh with a palette of brushes. At the time I was developing this code, all the existing 3D paint applications, like Maya and ZBrush, used really clunky projections that just didn't feel right. I took what I thought was the simple approach: raycast the mesh to find the brush's hit location, flood fill out on the mesh using the brush size to collect all the triangles hit by the brush application, and then render into the texture using the uv coordinates as the draw coordinates, the 3D coordinates as the texture coordinates, and the brush as the texture, while the texture matrix set up the brush's angle in 3D. I also generated "skirt polygons" around the actual mesh triangles in both 3D and texture space, and these were added to the collection of hit triangles if necessary. This avoided seams while painting, since a chart edge would have data beyond it when the texture resampler fetched neighboring texels for bilinear resampling. Finally, the texture was dilated so mipmaps would not show seams. All of this was hardware accelerated, of course, so it was silky smooth. It worked great, felt completely natural, and the artists loved it. Since then, most of the professional paint apps have adopted similar approaches as well, thankfully.
One cool thing about our paint system is the brushes could write into multiple channels simultaneously. In the prototype, we supported diffuse, specular exponent, gloss, emissive, and bump channels, with alpha masks for each. The bump channel was differentiated into a normal map after every paint application. You could have a brush that only wrote into the specular map without touching the other channels, for example, to make some parts of the skin look wet. We used the Photoshop PSD library I wrote for the Indie Game Jam to load the multi-channel brushes directly, so you could pack them all into a single Photoshop file, which was great for workflow.
Once the 3D paint prototype was up and running, our phenomenal artist John Cimino did a bunch of testing with it to prove out the system.
Particle Paint After the 3D paint prototype proved out the core technology, we needed to make it procedural, since we couldn't figure out a way to ship Ocean and John with each copy of the game. The lingua franca for data-driven procedural systems at Maxis is Andrew Willmott's Swarm effects/particle system. I wrote a plugin for Swarm that hooked up the paint system to the particle behavior, so Ocean and the other effects artists could write scripts to try to paint creatures. This hookup was interesting because I exposed the 2D mesh surface to the effects system as a constraint surface, and the particles were actually simulated in barycentric coordinates on the mesh itself. This allowed the particles to move arbitrarily fast, and they would zip right around the mesh with no gaps.
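To illustrate the barycentric simulation idea, here's a simplified sketch of one particle step. This is my own illustrative code, not the Swarm plugin: it clamps the step at the triangle edge and hands the particle to the neighboring triangle, whereas the real system would also carry the leftover velocity across:

```python
def step_on_mesh(tris, edge_tris, t, bary, d_bary):
    """Advance a particle that lives in barycentric coords on triangle t.

    tris:      list of (v0, v1, v2) global vertex-index triples
    edge_tris: dict mapping an undirected edge (min_v, max_v) to the
               list of triangles sharing it
    bary:      current barycentric coords (sums to 1)
    d_bary:    this step's barycentric delta (sums to 0)
    Returns (new triangle index, new barycentric coords).
    """
    b = [bary[k] + d_bary[k] for k in range(3)]
    i = min(range(3), key=lambda k: b[k])
    if b[i] >= 0.0:
        return t, b                       # still inside this triangle
    # Crossed the edge opposite vertex i: clamp the point onto that edge...
    b[i] = 0.0
    total = b[0] + b[1] + b[2]
    b = [x / total for x in b]
    j, k = (i + 1) % 3, (i + 2) % 3
    va, vb = tris[t][j], tris[t][k]
    shared = [x for x in edge_tris.get((min(va, vb), max(va, vb)), [])
              if x != t]
    if not shared:
        return t, b                       # mesh boundary: stay on the edge
    nb = shared[0]
    # ...and re-express the edge point in the neighbor's barycentric frame.
    nb_bary = [0.0, 0.0, 0.0]
    for local, v in enumerate(tris[nb]):
        if v == va:
            nb_bary[local] = b[j]
        elif v == vb:
            nb_bary[local] = b[k]
    return nb, nb_bary
```

The key property is that the particle never leaves the surface: crossings are resolved in the mesh's own coordinate system, so arbitrarily fast particles can't tunnel off the skin or leave gaps in the paint.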
Here's a video of the skinpaint system at work in the prototype on a test mesh:
As you can read about in Ocean's blog post, this system proved out the concept, but more work was necessary. Henry took it from here, and replaced the Swarm hookup with a custom system that allowed the scripts to reason about the morphology of the creature, doing things like giving the particles limb-relative coordinate systems, etc. Henry also ported all the lowlevel paint code from the OpenGL prototype to the RenderWare/Direct3D game engine, and made it run in the background, a thankless and difficult task!
I've written and lectured a lot about the creature animation system, so I'll just point to those resources here:
- Real-time Motion Retargeting to Highly Varied User-Created Morphologies, the SIGGRAPH 2008 Technical Paper about the Spore Animation System. Very detailed and technical. Includes videos of Spasm, the animation tool.
- How To Animate a Character You've Never Seen Before, the Game Developers Conference 2007 lecture on the system. Much more accessible, less detailed.
I tried to include a cute video of an animation John Cimino did of Spore creatures singing the Mahna Mahna song, but YouTube flagged it as copyright infringing. Damn you, DSP algorithms!
Behavior Tree AI
Early in production we were running into trouble with our existing Artificial Intelligence (AI) system for creatures, so Tom Bui, Lauren McHugh, and I did a bunch of research and decided to switch to a Halo-style Behavior Tree (BT) system. Damian Isla documented Bungie's system at GDC 2005, and it was reprinted on Gamasutra. I spent a lot of time talking with Damian, Max Dyckhoff, and Chris Butcher about their system, what worked, what didn't, and how they'd change it, and ended up creating our BT system as a kind of "version 1.5" of theirs. I'm trying to get permission to release it to the public domain, but for now, here is the main chunk of the documentation, which can be read as a sequel to Damian's article linked above.
- Spore Behavior Tree Docs, the first part of a kind of pseudo-literate program describing how our system works.
- Lauren gave a lecture at the 2008 GDC on our system as well, but I believe only the audio is available right now.
What are Behavior Trees? Various people have asked me where I think Behavior Trees fit into the taxonomy of game AI techniques. I characterize them as still firmly in the Finite State Machine category. The magic of Behavior Trees is not that they're a new technique, but that they have such good development process properties—meaning the friction for using them is very low—that they allow you to scale your AI complexity far beyond the other FSM representation methods I've seen, including switch statements, objects for states, graphical node editors, etc. In effect, they allow you to put off the (inevitable?) transition to real planning, with all of its implementation and debugging issues, by allowing you to continue to use FSMs for more and more advanced AI. BTs have limits as well, and I do think game AI will be planning-based in the future, but putting off dealing with an opaque planner is a good thing. I liken it to hotloading data files: hotloading is just a process improvement; it doesn't change your data or code much; it doesn't directly make your game better. But it's such a massive process improvement that it becomes a revolution in productivity by allowing you to iterate faster. BTs are the same kind of thing for AI. They're so simple to use and debug, and have such good scalability characteristics (avoiding the n² issue mentioned in the docs, etc.), that they allow you to vastly increase your AI's complexity without running into a wall.
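For readers who haven't seen one, here's a toy selector/sequence skeleton in the style of Damian's article. This is illustrative only; Spore's system is far richer, and all the names here are mine:

```python
SUCCESS, FAILURE, RUNNING = "success", "failure", "running"

class Sequence:
    """Runs children in order; stops at the first child that doesn't succeed."""
    def __init__(self, *children):
        self.children = children
    def tick(self, bb):
        for child in self.children:
            status = child.tick(bb)
            if status != SUCCESS:
                return status
        return SUCCESS

class Selector:
    """Tries children in priority order; stops at the first that doesn't fail."""
    def __init__(self, *children):
        self.children = children
    def tick(self, bb):
        for child in self.children:
            status = child.tick(bb)
            if status != FAILURE:
                return status
        return FAILURE

class Condition:
    def __init__(self, pred):
        self.pred = pred
    def tick(self, bb):
        return SUCCESS if self.pred(bb) else FAILURE

class Action:
    def __init__(self, fn):
        self.fn = fn
    def tick(self, bb):
        return self.fn(bb)

# A toy creature brain: flee if threatened, otherwise eat if hungry.
brain = Selector(
    Sequence(Condition(lambda bb: bb["threatened"]),
             Action(lambda bb: bb.update(doing="flee") or SUCCESS)),
    Sequence(Condition(lambda bb: bb["hungry"]),
             Action(lambda bb: bb.update(doing="eat") or SUCCESS)),
)
```

Ticking the root every frame re-evaluates the priorities from the top, which is what gives BTs their FSM-like predictability while keeping the tree itself declarative and easy to debug.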
Early in Spore's development I was involved in prototyping designs for the creature game, and eventually ended up co-leading the creature team for a year or so with Alex Hutchinson, before giving that up to focus on the animation system for the last year and a half of the project. Most of the prototyping work was centered around finding cool ways to use the creature animation system.
Here's a fun video of an early prototype:
Cute vs. Science Since this notion just seems to keep reanimating like a really bad zombie movie, even though it's been called nonsense by me, Executive Producer Lucy Bradshaw, and Will Wright, I will simply state clearly that at no point did I try to steer the game towards a simplistic, shallow, or casual design. Sigh.
Helping to make Spore was a blast, and I'm really proud of our work on the project. The number of problems we had to solve that cut across design, aesthetics, user interface, and technology was unprecedented in the history of game development, I think. This page is really long, but it only covers the stuff I touched, and so even in spite of its length, it represents a small subset of all the amazing problems the team as a whole solved. Keep an eye on Ocean's blog as he digs up more stuff from his development archive, and we're constantly releasing new stuff at spore.com as well.
And, of course, the best way to look into these technologies is to play Spore or download the free Creature Creator and make some creatures!
There have been zillions of amazing creatures created in Spore, and I've used many of them in my lectures around the world, but here is my all-time favorite creature, made very early on by a contributor to the Something Awful Forums.
When you're making games, and struggling with technology and design and people and all the rest, it's things like this that make it all worthwhile.
- ↑ An online copy of the Spore credits, presumably taken from the game, is here.
- ↑ I have much more to write about this topic in the future. It's a controversial topic even amongst developers. I find it interesting to contrast the way we as an industry handle game credits with the ways other art forms handle credits, but more on that later.
- ↑ I hate software patents. They stifle innovation and waste work.
- ↑ My favorite map projection is the Peirce Quincuncial Projection; it's beautiful aesthetically and mathematically.
- ↑ I stole this image from this page, not sure where they got it from, but it's a perfect example, so thanks to them for finding it!
- ↑ For example, see Least squares conformal maps for automatic texture atlas generation.
- ↑ Henry ended up disabling the skirts for performance reasons in the final shipping version, and simply dilating the texture. You can see the seams in the editor when the paint scripts are running, but then they disappear when the painting has finished.