
About Laserdiscs, part 1

Laserdiscs come in two varieties: CAV (constant angular velocity) and CLV (constant linear velocity).

As its initials imply, a CAV laserdisc spins at the same speed no matter where on the disc it is being read from. Since laserdiscs play back at a fixed data rate, this implies that data is packed more tightly near the center of the disc than it is toward the outer edges.

If you remember your basic geometry, the circumference of a circle is directly proportional to the radius (C = K × r). Let's say the disc spins just fast enough so that one rotation holds one video frame. So if you encode one frame's worth of data at r=5, you have to pack it into a linear distance of 5K. If you encode the same amount of data at r=8, you have much more room (8K) to store it in.

In contrast, a CLV laserdisc packs its data at the same density regardless of its location on the disc. This means that in order to read it, a laserdisc player must adjust the rotation speed depending on how far from the center of the disc it is reading. Compact discs are CLV devices, and you've probably noticed the disc speed changing as you seek back and forth.

Back to the geometry: on CLV discs, data is stored at a constant density of R units per unit of circumference, so one rotation holds R × C units of data. Since the circumference is proportional to the radius, the amount of data per rotation is proportional to the radius as well (R × (K × r) = RK × r). This means that at r=5, one rotation stores 5RK units of data, and at r=8, one rotation stores 8RK units.

It's pretty clear that CLV discs pack the data more efficiently than CAV discs. This is because with CLV discs, you can pack the data at the maximum density the player is capable of reading consistently across the entire disc, whereas with CAV discs, the data is efficiently packed only at the inner part of the disc and gets increasingly less densely packed as you move farther from the center. In reality, CAV discs maxed out at about 30 minutes (54,000 frames) of playback per side, while CLV discs generally got about 60 minutes per side.
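The geometry above is easy to sanity-check numerically. Here is a rough Python sketch that counts how many frames fit on a CAV disc versus a CLV disc over the same range of radii; all of the constants are made up for illustration and are not real laserdisc dimensions:

```python
# Toy model of CAV vs. CLV capacity using the article's r=5..8 example.
# All constants are invented for illustration, not real laserdisc specs.
import math

R_INNER = 5.0   # innermost usable radius (arbitrary units)
R_OUTER = 8.0   # outermost usable radius
PITCH = 0.001   # radial spacing between adjacent tracks

N_TRACKS = int(round((R_OUTER - R_INNER) / PITCH))

def cav_frames():
    # CAV: one frame per rotation no matter the radius, so capacity is
    # just the number of tracks that fit between the two radii.
    return N_TRACKS

def clv_frames(frame_arc):
    # CLV: every frame occupies the same arc length, so one rotation at
    # radius r holds (2*pi*r / frame_arc) frames. Sum over all tracks.
    total = 0.0
    for k in range(N_TRACKS):
        r = R_INNER + k * PITCH
        total += (2 * math.pi * r) / frame_arc
    return int(total)

# Size a CLV frame to exactly fill one rotation at the inner radius,
# matching the premise that CAV packs maximally only at the inner edge.
inner_rotation = 2 * math.pi * R_INNER
print(cav_frames(), clv_frames(inner_rotation))
```

With these toy numbers the CLV disc holds about 1.3× as many frames; the advantage is the ratio of the average radius to the inner radius, and real discs span a wider range of radii, which is where the roughly 2× real-world difference comes from.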

Given these facts, why on earth would you ever create a CAV disc, when a CLV disc allows you to pack the data more efficiently? Well, because the simplicity of finding information on a CAV disc enabled many special features: still frames, reverse play, slow motion, and — most importantly for laserdisc-based video games — direct access to any frame on the disc by index.

With a CAV disc, each video frame corresponds to one rotation of the disc. Thus, if you want to advance 100 frames ahead, you simply move the laser up the appropriate amount and read the data there. On the other hand, with a CLV disc, advancing 100 frames involves knowing how fast the data is coming and computing where to seek and how much to adjust the rotation speed to find the target, a much more complicated maneuver.
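That difference can be sketched with a little geometry. In this Python sketch the pitch and radius constants are invented for illustration; the point is only that the CAV hop is fixed while the CLV hop depends on where you already are:

```python
# Why CAV seeks are trivial and CLV seeks are not. Geometry only;
# the constants below are invented for illustration.
import math

PITCH = 0.0016   # radial distance between adjacent tracks (made up)
R0 = 55.0        # radius where frame 0 lives (made up)

def cav_radius(frame):
    # CAV: frame n is exactly n tracks out from frame 0, so advancing
    # 100 frames is always the same fixed radial hop.
    return R0 + frame * PITCH

def clv_radius(frame, frame_arc):
    # CLV: each frame occupies the same arc length (frame_arc), so the
    # spiral consumes frame_arc * PITCH of disc area per frame:
    #   pi * (r^2 - R0^2) = frame * frame_arc * PITCH
    return math.sqrt(R0 ** 2 + frame * frame_arc * PITCH / math.pi)

# On CAV the +100-frame hop never changes; on CLV it shrinks as you
# move outward, and the player must also retune the rotation speed.
arc = 2 * math.pi * R0
hop_cav = cav_radius(100) - cav_radius(0)
hop_clv_inner = clv_radius(100, arc) - clv_radius(0, arc)
hop_clv_outer = clv_radius(1100, arc) - clv_radius(1000, arc)
print(hop_cav, hop_clv_inner, hop_clv_outer)
```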

When it comes to movies, most laserdiscs were produced as CLV, to minimize the number of times you need to flip or change discs. A few special edition and high-end versions of movies were released as CAV, enabling access to nice still frames and other effects.

When it comes to videogames, however, it's all CAV. In the next article, I'll ignore CLV discs entirely, and talk about tracks, frames, and VBI data on CAV discs.

California Extreme 2008 Musings

This was my 9th CAX (8th in a row) — hard to believe!

My talk went pretty well and was surprisingly well-attended for something that started at 7:45pm. Thanks to everyone who showed up. It was nice to really have a chance to tell folks in person exactly what motivates people like me to work on MAME, and also to clear up some misconceptions people have about the project. Hopefully there will be some means for everyone to see it eventually.

Got the chance to have dinner afterwards with R. Belmont, Tim Lindner, and Mark G. I was so wired after talking that it was really nice to sit down and talk shop over some beers and pizza.

If you ever go to the show, make sure you stick around until it closes on Saturday night. People start clearing out in the early evening, and by 10-11 you can pretty much play anything you want without a wait. Sundays are also less crowded, though games start disappearing late Sunday afternoon as people who have a lot of games to move like to get a jump on getting it all packed up.

Loved meeting up with so many people this time. I usually keep a low profile, but because I had a talk on Saturday, it was hard to avoid people knowing who I was.

Thanks to Steve for coming down, playing some games, and driving us to In-n-Out for lunch!

Backbone Entertainment was there showing off their Capcom HD remixes. I have to say they looked quite impressive, especially the 1942 game with proper 3D terrain underneath and impressive shader effects (though to be fair, it was more Raiden than 1942, but as a Raiden fan I can forgive that). If I owned an NG console, I'd probably pick that one up!

Yes, your monitor can be too big. They had a Raiden Fighters there that was just unplayable because the giant monitor meant that you could not rely on your peripheral vision to see what was going on enough to survive.

Got the Mappy high score (~145,000) during the first half hour of play on Saturday, fully expecting it to be bested by later in the day. But the coin door was locked, and it wasn't on free play, so the credits ran out and the score stuck around all day. Eventually someone turned it off. Woohoo, I am teh Mappy champion!

Got to play a one-of-a-kind driving game prototype made by Phantom Systems. It was a hoot. Really cheesy early 3D polygons, terrible physics (100% friction means no sliding), tons of Z fighting. Definitely a "so bad it's good" moment. Maybe we can track down the owners and get it in MAME someday for others to enjoy.

There was a really nice mix of games this year. It definitely felt different from the last couple, where it seemed like the same folks brought the same games. Of course, some of my favorites were missing (Moon Patrol, Elevator Action), but there were some new ones to play, like Jack the Giantkiller, Rescue, Space Lords, Xybots, Arm Wrestling, and NARC.

There were many Asteroids machines, but they all seemed to be playing freakin' Asteroids Deluxe (which I dislike). Eventually, I realized that at least a couple of them had a multigame installed. Once that was sorted I was able to play the original in all its glory.

Panic Park was there again. What a great game. Somebody has to put that in MAME, though it won't be nearly the same without the controls.

Looping and Up n' Down win this year's awards for Can't Even Get Past the First Level, joining Defender and Super Zaxxon. I'm not much of a game player, but come on, usually the first level is doable after a few tries in most games.

Once again I reaffirmed that I completely suck at pinball, though there was one there that was being really nice to me (and not so much to my friend Steve, whom I think I beat 36 million to 2). Wish I could remember which one it was (maybe The Shadow). Also saw an awesome old school Ted Nugent pinball, which ironically seemed to make the girliest sounds. Cracked me up while playing.

Lesson learned: never, ever forget to bring your universal hard disk-to-USB converter again. *sigh*

DirectShow, part 2

A while back, when the initial push to start adding laserdisc support to MAME was happening, I discovered that there was something missing in all existing laserdisc captures. If you know anything about the NTSC broadcast standard, you know that a video signal consists of frames of 525 scanlines, transmitted at 30Hz (actually 29.97Hz). Of those 525 scanlines, 480 are considered to be "visible" scanlines, while the remaining 45 constitute the vertical blanking interval (VBI).

At first, the VBI part of the scan just contained a couple of special signals for synchronization, but over time, people started adding data to them. The most famous example is closed captioning, which is usually encoded as binary data on line 21 of each video field (a single 30Hz frame is sent as two 60Hz fields, interlaced).
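As a rough illustration of that line-21 format (CEA-608 style: two bytes per field, each carrying 7 data bits plus an odd-parity bit in the high bit), here is a sketch of the parity-check step a caption decoder performs; the analog slicing of the waveform itself is omitted:

```python
# Minimal sketch of the line-21 byte format (CEA-608 closed captions):
# each field carries two bytes, each with 7 data bits plus an
# odd-parity bit in the high bit. This shows only the parity/strip
# step, not the analog slicing a capture filter performs.

def decode_cc_byte(raw):
    # Odd parity: the total number of set bits in the 8-bit value
    # must be odd, or the byte is rejected.
    if bin(raw).count("1") % 2 != 1:
        return None          # parity error; real decoders flag or drop it
    return raw & 0x7F        # keep the 7 data bits

# 'H' = 0x48 has two set bits, so the parity bit must be set: 0xC8.
print(decode_cc_byte(0xC8))  # parity ok -> 0x48
print(decode_cc_byte(0x48))  # parity error -> None
```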

Getting back to DirectShow, the way this is handled for video capturing is that the video capture component has a VBI output, which is the raw VBI data, usually sampled at a high rate (2x or 3x the video sampling rate) for improved accuracy. As with everything in DirectShow, there are filters that you can attach the VBI output to, which will decode the line 21 data into binary data, and then further filters that will render the captions as overlays on top of the video stream.

The thing I discovered about laserdiscs, though, is that they encoded several very important bits of metadata on some of the VBI lines. Specifically, line 12 has what is known as a "white flag", line 16 contains some control bits, and lines 17-18 contain information about the current frame number and chapter. Even more importantly, all of this metadata is crucial to the way laserdiscs are controlled and operated in video games.
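To give a sense of what decoding that metadata involves, here is a hedged sketch based on the laserdisc format's 24-bit VBI codes: a CAV picture-number code carries 0xF in its top nibble and five BCD digits of frame number in the low 20 bits. White-flag and chapter handling are omitted, and the bit extraction from the video lines is assumed to have already happened:

```python
# Sketch of decoding a CAV picture (frame) number from one of the
# 24-bit codes carried on the laserdisc VBI lines. A picture-number
# code has 0xF in the top nibble and five BCD digits below it.

def decode_picture_number(code):
    if (code >> 20) != 0xF:
        return None               # not a picture-number code
    frame = 0
    for shift in (16, 12, 8, 4, 0):
        digit = (code >> shift) & 0xF
        if digit > 9:
            return None           # not valid BCD
        frame = frame * 10 + digit
    return frame

print(decode_picture_number(0xF54000))  # frame 54000, the last on a CAV side
```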

The problem is, how do you capture that data? Well, you could route the raw VBI data to a file and then attempt to sync it up with your video capture. But an easier way to deal with it would be to simply take the VBI data, reconstitute it as a video signal, and render it above the visible video, essentially reproducing a capture of the entire video signal — all 525 lines of it.

Sadly, such a filter didn't exist (or at least I was unable to find such a thing). But thankfully, the DirectShow architecture makes it a fairly simple matter to write your own, which turned into a little side project that I have been working on. Although not quite ready for prime time (synchronization is still a little tricky), the basic filter is now up and running, and we have some actual demo captures with the laserdisc VBI data intact:

(In the current version, I extract the laserdisc metadata and overlay it on the video as well, to see if the information matches up. The frame number that the laserdisc player decoded is in large text at the top, and the information my filter has decoded is displayed on the bottom, with field 1 info on the left and field 2 info on the right.)

In the end, I'll post the source and binary versions of this filter so that it will be possible to get consistent laserdisc captures complete with metadata (on Windows only, though similar techniques can probably be applied to other platforms). And yes, with this problem solved, proper captures should start happening in the coming months.

DirectShow

Had my first real exposure to DirectShow recently and I've come away pretty impressed. For those who don't know, DirectShow is an architecture inside Windows that is used for streaming and capturing media (audio/video). It works through a series of independent components, each of which has some number of inputs and outputs. These components can then be assembled as appropriate to route data through the system.

(Note that I believe there is a successor to DirectShow, but since I'm an old-skool kind of person and wanted compatibility with other DirectShow components, I went that way.)

Components fall into three categories: sources, which only have outputs and provide source data; renderers, which only have inputs and "render" the final data; and filters, which have both inputs and outputs and transform the data in some way.

Now, this might not seem all that impressive until you realize that the process of connecting components and building up a data flow has been completely abstracted. The key to this is the Microsoft-provided utility GraphEdit. What this simple utility does is allow you to add a bunch of filters and then interconnect them. You then hit "play" and the filters are activated. Assuming you have assembled the filters and connections between them properly, data flows from the source through the filters and to the renderer(s).

As a super-simple example, you can take a still image (say, a JPG file) and make that your "source" via the Generate Still Video component. You can then route the output of that to a Color Space Converter filter which applies color transformations to the image. And then you can route the output of the filter to the "Video Renderer", which renders any video inputs to the screen in a window.

Once you've built this graph, you simply hit "play" and up pops a window with the JPG image.

Ok, that was super-simple, but you start to get the idea. The next step is to try changing the renderer to something else. Let's say we want to output an AVI file of this still image. We can do that by first connecting the output of the Generate Still Video source to one of the inputs of an AVI Mux filter (this filter accepts multiple inputs — usually a video stream and an audio stream — and interleaves them into a single AVI data stream). We then connect the output of the AVI Mux filter to a "File Writer" renderer, which lets us route the raw output of any filter to a file.

Now when we hit "play", the still video source is converted to an AVI file.

Let's say we want to both view the output AND generate an AVI file. No problem. We insert a new Smart Tee filter, which has one input and two outputs, each of which contains a copy of the input. Then we wire one output through the Color Space Converter and out to the Video Renderer to see what we are viewing, and we wire the second output through the AVI Mux and out to the File Writer.

Hitting "play" now does both: an image and an AVI file.

Now for the main event. Let's swap out the source with something more interesting. Say, the video capture component provided by your video capture card (each card is different, so I can't provide the exact details here, but you need to dig through all the filters to find them). Usually there are two pieces: a Crossbar source component, and a Capture filter component. Set up the Crossbar to output the appropriate source from your capture device, route that through the Capture component, and then route the outputs from there into the Smart Tee, which is still outputting to the screen and an AVI file.

Hit "play" and now with nothing more than this GraphEdit tool, you have a video capture solution. Granted, it's a bit bare bones (some components have property panels you can get at via a right-click), but it works. And the ability to wire up the filters comes in handy in a major way, as I'll explain in a follow-up article.

(Yes, this post is in the right category :)

Month of Bugfixes Recap

Well, the last few days didn't generate any qualifying bugfixes, so the final donation is $1,080. PayPal sent!

Thanks again to everyone who contributed! There were a lot of great contributions from outside the core team, plus the return of a few old-timers for some guest bug fixes, and good effort all around.

One interesting side-effect is that the awareness of MAMETesters went up, which is a good thing. But it also means that a lot more bugs were filed this month, so the net effect on the total numbers is probably not quite as positive as I might have imagined.

Would I do it again? Maybe, but not very soon. It definitely increased awareness, but it's also clearly difficult to keep momentum going for a whole month. Perhaps I'll try something a little different next time, who knows?