Ludum Dare #24 Game Jam

August 24th, 2012

A few days ago I heard from a friend about a competition for crazy people. Its creators would have you, given nothing but 48 hours and a theme, make a complete game. Naturally, this sounded like a pretty worthy use of 48 hours to me, so this is what I am doing as of about an hour ago.

The required theme this weekend is "evolution". I got pretty attached to the first idea that popped into my head, which was a shmup where the enemies come in waves, each with a variety of features like armor and shields and unique guns and stats like attack speed and power. Each successive wave would be born from the most successful enemies of the previous wave - the enemies gradually becoming more and more deadly. It took some doing, but I managed to convince myself to think through some other ideas. Here I am about a (rather emotionally trying) hour later, with a different, rather simpler idea that's going to look more like the puzzle game offspring of Rock-Paper-Scissors and Conway's Game of Life. It would take far too long to explain it now, and I'll probably change it all anyway, but that's where it stands. More as it happens!

http://www.ludumdare.com/compo/





Dynamically Lighting 2D Scenes with Normal Maps

April 13th, 2012

My random inspiration of this week occurred when one of my professors, Dan Frost, mentioned basic lighting techniques in lecture. It turns out that for any surface, the final color of that surface with a light on it can be calculated quite simply: multiply the surface's color by the color of the light times the dot product of the surface normal and the normalized vector from the surface to the light. I thought to myself, "I could totally do that." And about an hour later, I had this demo up and running:
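That diffuse term can be sketched in a few lines of Python. The function and parameter names are mine (the demo itself ran on XNA's shader pipeline and isn't shown here), and the clamp to zero for back-facing surfaces is a standard addition, not something stated above:

```python
def diffuse(surface_rgb, light_rgb, normal, to_light):
    """Lambertian diffuse: surface color * light color * max(0, N . L).

    `to_light` is the normalized vector from the surface point toward the
    light. Colors here are tuples of floats in the 0..1 range.
    """
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, to_light)))
    return tuple(s * c * n_dot_l for s, c in zip(surface_rgb, light_rgb))

# A surface facing straight up, lit by a white light from directly above,
# keeps its full color:
lit = diffuse((1.0, 0.5, 0.25), (1.0, 1.0, 1.0), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
```

The `max(0, ...)` keeps surfaces that face away from the light black instead of negative.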

Normal mapping has traditionally been used in 3D games on surface textures for 3D models. In that context, it can make an extremely low-poly object appear much more detailed at a low processing cost, by rendering textures with lighting such that they appear to "pop out". In this demo, I have instead used it to fake a 3D effect on a completely flat 2D plane.

The process is fairly straightforward. First, the standard scene is rendered to a render target cleared black, with nothing special done. Then, the scene is rendered a second time to a second render target, but this time each image's normal map, if it has one, is drawn instead of the standard image. Normal maps encode the surface normal - the direction the surface faces - at each pixel of an image. In most implementations, the X, Y, and Z components of the normal vector are encoded as the R, G, and B components of the pixel's color. This can be seen toward the end of the above video. This second render target is cleared to RGB(128, 128, 255) so that images without normal maps are treated as flat planes facing the camera. Finally, both of these renders are passed to a shader that calculates the final color of each pixel using its color from the first image, its normal from the second image, and the locations and colors of all the lights in the scene.
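The per-pixel combination step can be sketched in Python. The decoding of channel values back to a normal follows the RGB encoding described above (so RGB(128, 128, 255) comes out as roughly (0, 0, 1), a flat surface facing the camera), while the function names, the 0-255 color handling, and the single point light are illustrative assumptions rather than the demo's actual shader:

```python
import math

def decode_normal(r, g, b):
    # Map 0..255 channel values back to a -1..1 normal vector.
    return tuple(c / 127.5 - 1.0 for c in (r, g, b))

def shade_pixel(color_px, normal_px, pixel_pos, light_pos, light_rgb):
    """Combine one pixel from the color pass with the matching pixel
    from the normal pass, under a single point light (0..255 colors)."""
    nx, ny, nz = decode_normal(*normal_px)
    # Normalized vector from the pixel toward the light.
    lx, ly, lz = (l - p for l, p in zip(light_pos, pixel_pos))
    length = math.sqrt(lx * lx + ly * ly + lz * lz) or 1.0
    lx, ly, lz = lx / length, ly / length, lz / length
    n_dot_l = max(0.0, nx * lx + ny * ly + nz * lz)
    return tuple(min(255, int(c * l * n_dot_l / 255))
                 for c, l in zip(color_px, light_rgb))
```

A pixel with the "no normal map" color RGB(128, 128, 255), lit by a white light directly in front of it, keeps its original color, which is exactly why that clear color makes unmapped images read as flat planes.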

I plan to post the code from this demo soon, after I get my work on ambient light, directional light, and multiple point lights in there, so look out for it!



Making Music Games (a programmer's postmortem)

March 20th, 2012

Being a rhythm game, Music Island presented a number of unique and fun challenges from a programming perspective. The great difficulty in programming rhythm-based games is keeping gameplay synchronized with the game's music. A fully-featured audio library like BASS might be able to analyze the music in real time and use it to guide gameplay. However, for our one-programmer team, analyzing the stream in real time was not really an option. We would have to find some way to overlay music onto the game and force the game to keep pace with it.

The concept was fairly simple. We would use MP3 music. With the length of the song, the number of measures, and the time signature, I could calculate the time in milliseconds between each beat - or any subdivision of the beat. While the song is running, the program keeps track of which beat it is on simply by counting up with a Stopwatch. When the elapsed time exceeds the time per beat, we advance to the next one (carrying over the leftover time).
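A sketch of that beat counter in Python, assuming the game feeds it elapsed milliseconds each frame (the class and member names are mine, not the game's):

```python
class BeatClock:
    """Counts beats from a song's length, measure count, and time signature.

    Leftover time is carried over when advancing to the next beat, so
    rounding error does not accumulate into drift over the song.
    """

    def __init__(self, song_length_ms, measures, beats_per_measure):
        # Milliseconds between beats, derived as described above.
        self.ms_per_beat = song_length_ms / (measures * beats_per_measure)
        self.elapsed_ms = 0.0
        self.beat = 0

    def update(self, delta_ms):
        # Called every frame with time elapsed since the last call
        # (the role XNA's Stopwatch played in the original).
        self.elapsed_ms += delta_ms
        while self.elapsed_ms >= self.ms_per_beat:
            self.elapsed_ms -= self.ms_per_beat  # carry over the remainder
            self.beat += 1
```

For example, a 60-second song of 30 measures in 4/4 gives 500 ms per beat (120 BPM); after 1250 ms the clock is on beat 2 with 250 ms carried over.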

To have the game actually use this information, I determine what kind of beat the song is currently on (quarters, eighths, thirty-seconds, etc.) and the distance in thirty-second notes from the nearest quarter note. If that value is within a threshold, I accept input and generate a spell.
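Assuming beat positions are counted in thirty-second notes (which the post implies but does not spell out), the nearest-quarter check might look like this - the names and the threshold value are illustrative:

```python
def thirty_seconds_from_quarter(beat_32nds):
    """Distance, in thirty-second notes, from the nearest quarter note.

    A quarter note spans 8 thirty-seconds, so the distance from the
    nearest one is at most 4.
    """
    offset = beat_32nds % 8
    return min(offset, 8 - offset)

def accept_input(beat_32nds, threshold=1):
    # Accept the player's input (and generate a spell) only when the
    # current position is close enough to a quarter note.
    return thirty_seconds_from_quarter(beat_32nds) <= threshold
```

This gives the player a small timing window on either side of each quarter note instead of demanding frame-perfect input.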

I found that MP3 files were not actually a sufficient solution, at least with our tools (the XNA framework). Despite our best efforts to get accurate timings, I could not make the game stay in sync with the music. Within 5 or 10 seconds, it would be on off-beats, and XNA offered no way to control how the MP3 was being played to correct this. I concluded that XNA's MediaPlayer class, which was playing the MP3s, was probably the issue, so our solution was to split the songs into short WAV segments. In addition, our songs were designed and composed so that they could be split into layers by instrument. We ended up with a bunch of two-measure WAV files of single-instrument lines, and I wrote an XML format so our composer could stack and sequence these segments however he liked. Reusing the same tune multiple times within a song also helped cut down the otherwise huge file size that WAV files would have caused.
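The XML format itself isn't shown in the post; purely as an illustration, a sequencing file in the spirit described - layers by instrument, two-measure segments placed at measure offsets, segments reused freely - might look something like this (every element and attribute name here is invented):

```xml
<song bpm="120" timeSignature="4/4">
  <layer instrument="flute">
    <segment file="flute_theme_a.wav" atMeasure="0" />
    <segment file="flute_theme_a.wav" atMeasure="2" /> <!-- same tune, reused -->
  </layer>
  <layer instrument="drums">
    <segment file="drums_base.wav" atMeasure="0" />
  </layer>
</song>
```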

Now with these segmented WAVs, we had far more control over the speed the music was played at, and we were able to synchronize the music periodically. This was ultimately sufficient to keep the game as a whole completely in sync. It was a feature we were still tweaking, though, even after the game's first submission!



