Articles posted July 2008

California Extreme 2008 Musings

This was my 9th CAX (8th in a row) — hard to believe!

My talk went pretty well and was surprisingly well-attended for something that started at 7:45pm. Thanks to everyone who showed up. It was nice to really have a chance to tell folks in person exactly what motivates people like me to work on MAME, and also to clear up some misconceptions people have about the project. Hopefully there will be some means for everyone to see it eventually.

Got the chance to have dinner afterwards with R. Belmont, Tim Lindner, and Mark G. I was so wired after talking that it was really nice to sit down and talk shop over some beers and pizza.

If you ever go to the show, make sure you stick around until it closes on Saturday night. People start clearing out in the early evening, and by 10-11 you can pretty much play anything you want without a wait. Sundays are also less crowded, though games start disappearing late Sunday afternoon as people who have a lot of games to move like to get a jump on getting it all packed up.

Loved meeting up with so many people this time. I usually keep a low profile, but because I had a talk on Saturday, it was hard to avoid people knowing who I was.

Thanks to Steve for coming down, playing some games, and driving us to In-n-Out for lunch!

Backbone Entertainment was there showing off their Capcom HD remixes. I have to say they looked quite impressive, especially the 1942 game with proper 3d terrain underneath and impressive shader effects (though to be fair, it was more Raiden than 1942, but as a Raiden fan I can forgive that). If I owned a NG console, I'd probably pick that one up!

Yes, your monitor can be too big. They had a Raiden Fighters there that was just unplayable because the giant monitor meant that you could not rely on your peripheral vision to see what was going on enough to survive.

Got the Mappy high score (~145,000) during the first half hour of play on Saturday, fully expecting it to be bested later in the day. But the coin door was locked, and it wasn't on free play, so the credits ran out and the score stuck around all day. Eventually someone turned it off. Woohoo, I am teh Mappy champion!

Got to play a one-of-a-kind driving game prototype made by Phantom Systems. It was a hoot. Really cheesy early 3D polygons, terrible physics (100% friction means no sliding), tons of Z fighting. Definitely a "so bad it's good" moment. Maybe we can track down the owners and get it in MAME someday for others to enjoy.

There was a really nice mix of games this year. It definitely felt different from the last couple, where it seemed like the same folks brought the same games. Of course, some of my favorites were missing (Moon Patrol, Elevator Action), but there were some new ones to play, like Jack the Giantkiller, Rescue, Space Lords, Xybots, Arm Wrestling, and NARC.

There were many Asteroids machines, but they all seemed to be playing freakin' Asteroids Deluxe (which I dislike). Eventually, I realized that at least a couple of them had a multigame installed. Once that was sorted I was able to play the original in all its glory.

Panic Park was there again. What a great game. Somebody has to put that in MAME, though it won't be nearly the same without the controls.

Looping and Up n' Down win this year's awards for Can't Even Get Past the First Level, joining Defender and Super Zaxxon. I'm not much of a game player, but come on, usually the first level is doable after a few tries in most games.

Once again I reaffirmed that I completely suck at pinball, though there was one there that was being really nice to me (and not so much to my friend Steve, whom I think I beat 36 million to 2). Wish I could remember which one it was (maybe The Shadow). Also saw an awesome old school Ted Nugent pinball, which ironically seemed to make the girliest sounds. Cracked me up while playing.

Lesson learned: never, ever forget to bring your universal hard disk-to-USB converter again. *sigh*

DirectShow, part 2

A while back, when the initial push to start adding laserdisc support to MAME was happening, I discovered that there was something missing in all existing laserdisc captures. If you know anything about the NTSC broadcast standard, you know that a video signal consists of 525 scanlines spread across a 30Hz (actually 29.97Hz) interval. Of those 525 scanlines, 480 are considered to be "visible" scanlines, while the remaining 45 constitute the vertical blanking interval (VBI).
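As a sanity check, the arithmetic above works out like this (a quick sketch; the exact NTSC rates derive from the 60000/1001 field rate):

```python
# Back-of-the-envelope NTSC timing, matching the figures above.
FIELD_RATE = 60000 / 1001        # ~59.94 Hz (two interlaced fields)
FRAME_RATE = FIELD_RATE / 2      # ~29.97 Hz
LINES_PER_FRAME = 525
VISIBLE_LINES = 480
VBI_LINES = LINES_PER_FRAME - VISIBLE_LINES   # 45 lines of vertical blanking
LINE_RATE = FRAME_RATE * LINES_PER_FRAME      # horizontal scan rate

print(f"frame rate: {FRAME_RATE:.2f} Hz")     # 29.97 Hz
print(f"VBI lines per frame: {VBI_LINES}")    # 45
print(f"line rate: {LINE_RATE:.1f} Hz")       # ~15734 Hz
```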

At first, the VBI part of the scan just contained a couple of special signals for synchronization, but over time, people started adding data to them. The most famous example is closed captioning, which is usually encoded as binary data on line 21 of each video field (a single 30Hz frame is sent as two 60Hz fields, interlaced).
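At the bit level, line 21 carries two bytes per field, each holding 7 data bits plus an odd-parity bit. A minimal decode might look like this (my own sketch; `decode_cc_bytes` is a hypothetical helper, and a real decoder also has to find the clock run-in and start bits first):

```python
def decode_cc_bytes(b1, b2):
    """Decode one line-21 closed-caption byte pair (hypothetical helper).

    Each byte carries 7 data bits in bits 0-6 plus an odd-parity bit
    in bit 7. Returns None in place of any byte that fails parity."""
    def check(b):
        if bin(b).count("1") % 2 != 1:   # expect odd parity overall
            return None                   # parity error
        return b & 0x7F                   # strip the parity bit
    return check(b1), check(b2)

# 'H' (0x48) and 'i' (0x69) with their odd-parity bits set:
print(decode_cc_bytes(0xC8, 0xE9))   # (0x48, 0x69) -> "Hi"
```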

Getting back to DirectShow, the way this is handled for video capturing is that the video capture component has a VBI output, which is the raw VBI data, usually sampled at a high rate (2x or 3x the video sampling rate) for improved accuracy. As with everything in DirectShow, there are filters that you can attach the VBI output to, which will decode the line 21 data into binary data, and then further filters that will render the captions as overlays on top of the video stream.

The thing I discovered about laserdiscs, though, is that they encoded several very important bits of metadata on some of the VBI lines. Specifically, line 12 has what is known as a "white flag", line 16 contains some control bits, and lines 17-18 contain information about the current frame number and chapter. Even more importantly, all of this metadata is crucial to the way laserdiscs are controlled and operated in video games.
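The frame-number codes on those lines are commonly described as a 24-bit value whose top nibble is 0xF, followed by five BCD digits giving the picture number. A rough parser might look like this (an assumption-laden sketch on my part; real players also recognize chapter, stop, and lead-in/lead-out codes):

```python
def parse_picture_number(code):
    """Parse a 24-bit laserdisc VBI code as a CAV picture number.

    Sketch based on the common description of the encoding: a top
    nibble of 0xF marks a picture number, and the remaining five
    nibbles are BCD digits. Returns None for anything else."""
    if (code >> 20) != 0xF:
        return None            # not a picture-number code
    frame = 0
    for shift in (16, 12, 8, 4, 0):
        digit = (code >> shift) & 0xF
        if digit > 9:
            return None        # invalid BCD digit
        frame = frame * 10 + digit
    return frame

print(parse_picture_number(0xF12345))   # 12345
```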

The problem is, how do you capture that data? Well, you could route the raw VBI data to a file and then attempt to sync it up with your video capture. But an easier way to deal with it would be to simply take the VBI data, reconstitute it as a video signal, and render it above the visible video signal, to essentially reproduce a capture of the entire video signal — all 525 lines of it.
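In code, that idea boils down to resampling the high-rate VBI lines down to the video sample rate and prepending them to the active picture. A rough sketch (the names are mine, and the naive decimation here stands in for proper filtering):

```python
def stack_full_field(vbi_rows, active_rows, oversample=2):
    """Rebuild a full field by decimating the high-rate VBI samples to
    the video sample rate and stacking them above the active picture.

    Illustrative sketch only: vbi_rows are sampled at `oversample`
    times the rate of active_rows, so taking every Nth sample crudely
    brings them to a common width."""
    width = len(active_rows[0])
    resampled = [row[::oversample][:width] for row in vbi_rows]
    return resampled + active_rows

# One 2x-oversampled VBI line stacked above two active lines:
field = stack_full_field([list(range(16))], [[0] * 8, [1] * 8])
print(len(field), len(field[0]))   # 3 8
```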

Sadly, such a filter didn't exist (or at least I was unable to find such a thing). But thankfully, the DirectShow architecture makes it a fairly simple matter to write your own, which turned into a little side project that I have been working on. Although not quite ready for prime time (synchronization is still a little tricky), the basic filter is now up and running, and we have some actual demo captures with the laserdisc VBI data intact:

(In the current version, I extract the laserdisc metadata and overlay it on the video as well, to see if the information matches up. The frame number that the laserdisc player decoded is in large text at the top, and the information my filter has decoded is displayed on the bottom, with field 1 info on the left and field 2 info on the right.)

In the end, I'll post the source and binary versions of this filter so that it will be possible to get consistent laserdisc captures complete with metadata (on Windows only, though similar techniques can probably be applied to other platforms). And yes, with this problem solved, they should be starting to happen in the coming months.


Had my first real exposure to DirectShow recently and I've come away pretty impressed. For those who don't know, DirectShow is an architecture inside Windows that is used for streaming and capturing media (audio/video). It works through a series of independent components, each of which has some number of inputs and outputs. These components can then be assembled as appropriate to route data through the system.

(Note that I believe there is a successor to DirectShow, but since I'm an old-skool kind of person and wanted compatibility with other DirectShow components, I went that way.)

Components fall into three categories: sources, which only have outputs and provide source data; renderers, which only have inputs and "render" the final data; and filters, which have both inputs and outputs and transform the data in some way.
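A toy model of those three categories makes the data flow concrete (my own sketch in plain Python generators — not DirectShow's actual COM interfaces):

```python
# Source -> filter -> renderer, modeled with generators.
class Source:
    """Only outputs: provides the original data."""
    def __init__(self, frames):
        self.frames = frames
    def run(self):
        yield from self.frames

class Filter:
    """Input and output: transforms data on the way through."""
    def __init__(self, fn):
        self.fn = fn
    def run(self, upstream):
        for frame in upstream:
            yield self.fn(frame)

class Renderer:
    """Only inputs: consumes the final data."""
    def __init__(self):
        self.rendered = []
    def run(self, upstream):
        for frame in upstream:
            self.rendered.append(frame)

source = Source(["frame0", "frame1"])
transform = Filter(str.upper)          # stand-in for a real transform
sink = Renderer()
sink.run(transform.run(source.run()))
print(sink.rendered)                   # ['FRAME0', 'FRAME1']
```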

Now, this might not seem all that impressive until you realize that the process of connecting components and building up a data flow has been completely abstracted. The key to this is the Microsoft-provided utility GraphEdit. What this simple utility does is allow you to add a bunch of filters and then interconnect them. You then hit "play" and the filters are activated. Assuming you have assembled the filters and connections between them properly, data flows from the source through the filters and to the renderer(s).

As a super-simple example, you can take a still image (say, a JPG file) and make that your "source" via the Generate Still Video component. You can then route the output of that to a Color Space Converter filter which applies color transformations to the image. And then you can route the output of the filter to the "Video Renderer", which renders any video inputs to the screen in a window.
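Per pixel, the math inside a converter like that is straightforward; here's one example conversion, BT.601 luma from RGB (just a sketch of the per-pixel arithmetic — the actual filter translates whole frames between formats like RGB24 and YUY2):

```python
def rgb_to_luma(r, g, b):
    """BT.601 luma from an RGB triple, the kind of per-pixel math a
    color space conversion performs. Sketch only; real conversions
    also produce chroma and handle packed pixel formats."""
    return round(0.299 * r + 0.587 * g + 0.114 * b)

print(rgb_to_luma(255, 255, 255))   # 255 (white)
print(rgb_to_luma(0, 0, 0))         # 0 (black)
```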

Once you've built this graph, you simply hit "play" and up pops a window with the JPG image.

Ok, that was super-simple, but you start to get the idea. The next step is to try changing the renderer to something else. Let's say we want to output an AVI file of this still image. We can do that by first connecting the output of the Generate Still Video source to one of the inputs of an AVI Mux filter (this filter accepts multiple inputs — usually a video stream and an audio stream — and interleaves them into a single AVI data stream). We then connect the output of the AVI Mux filter to a "File Writer" renderer, which lets us route the raw output of any filter to a file.
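At a high level, the interleaving the AVI Mux performs looks something like this (a simplified sketch of the idea; the real filter also deals with timestamps, headers, and index chunks — the `"vids"`/`"auds"` tags here just echo AVI's stream-type naming):

```python
def interleave(video_chunks, audio_chunks):
    """Interleave paired video and audio chunks into one stream, the
    way an AVI mux does at a very high level (sketch only)."""
    out = []
    for v, a in zip(video_chunks, audio_chunks):
        out.append(("vids", v))
        out.append(("auds", a))
    return out

print(interleave(["v0", "v1"], ["a0", "a1"]))
```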

Now when we hit "play", the still video source is converted to an AVI file.

Let's say we want to both view the output AND generate an AVI file. No problem. We insert a new Smart Tee filter, which has one input and two outputs, each of which contains a copy of the input. Then we wire one output through the Color Space Converter and out to the Video Renderer to see what we are viewing, and we wire the second output through the AVI Mux and out to the File Writer.
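The Smart Tee's job — one input, two outputs, each seeing a copy of the stream — maps neatly onto plain Python iterators (a sketch, obviously, not the real DirectShow filter, which also distinguishes preview from capture pins):

```python
from itertools import tee

# Duplicate one stream of frames into two independent outputs.
frames = iter(["f0", "f1", "f2"])
preview_out, capture_out = tee(frames, 2)

print(list(preview_out))   # ['f0', 'f1', 'f2']
print(list(capture_out))   # ['f0', 'f1', 'f2']
```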

Hitting "play" now does both: an image and an AVI file.

Now for the main event. Let's swap out the source with something more interesting. Say, the video capture component provided by your video capture card (each card is different, so I can't provide the exact details here, but you need to dig through all the filters to find them). Usually there are two pieces: a Crossbar source component, and a Capture filter component. Set up the Crossbar to output the appropriate source from your capture device, route that through the Capture component, and then route the outputs from there into the Smart Tee, which is still outputting to the screen and an AVI file.

Hit "play" and now with nothing more than this GraphEdit tool, you have a video capture solution. Granted, it's a bit bare bones (some components have property panels you can get at via a right-click), but it works. And the ability to wire up the filters comes in handy in a major way, as I'll explain in a follow-up article.

(Yes, this post is in the right category :)

RSS is back

Just a quick note to point out that I have resurrected the RSS feeds. Required a little bit more PHP, but wasn't too daunting, and now you can once again subscribe to a whole feed or category-specific feeds using the links to the right. Alternately, I've added the proper tags so that IE7 and later pick up on the appropriate feed and "light up" to let you know there is something worth subscribing to.

More to come this coming week once the long holiday is through....