USB video frames -> C array on Snow Leopard: fastest path?

Member
Posts: 45
Joined: 2006.07
Post: #1
Hey all -

I'm looking for a fast way to get frames from a USB camera (or the built-in iSight) into a C/C++ program. The video is going to be used to control a robot. By fast I mean low latency, as well as low learning curve. I acknowledge that these are often mutually exclusive goals :)

I found this page on the Apple website which has sample code from a bunch of frameworks: http://developer.apple.com/mac/library/n...urce+Types

...but I've never actually touched Objective-C before in my life. Before I delve too deeply into sample code, does anyone have helpful pointers to offer? Am I opening up a giant can of worms here by poking around with the Image Capture/QuickTime stuff?

The alternative is to use OpenCV, which has its own image capture API but it's way more library than I need just to get at the bytes from the camera.
Luminary
Posts: 5,143
Joined: 2002.04
Post: #2
Having used both the QTKit and QuickTime APIs for this purpose, I can honestly say they are both vile. If OpenCV reads directly from the USB driver it might well be better than either. We had latency problems with QTKit, though it's the "recommended" (as in, future-proof) API.
Member
Posts: 45
Joined: 2006.07
Post: #3
Well, OpenCV fails to build, both from MacPorts and from the SourceForge download. I found the StillFrames example code on developer.apple.com which seems to do pretty much what I want. So QTKit, here I come.
Member
Posts: 45
Joined: 2006.07
Post: #4
Ok, I got an implementation running. I started out by making a very minimal C wrapper around QTKit, in particular QTCaptureDecompressedVideoOutput (header file below). The idea is to pick a video device, set a callback, and start the device so the callback fires for each frame.

I embedded all of this inside of a Qt4 application and it works! My callback is getting called and the bytes make totally reasonable images. Super.

The only problem is that this seems to need an NSApplication to pump the event loop under the hood, so I can't run it from a console application right now.

Does anyone have any example code that creates a minimal NSApplication and pumps its event loop so that I can do this in a non-GUI application?

-m

Here is my video_capture.h:
Code:
typedef struct video_capture* vcapture;

enum vcapture_format_type {
  VC_UNKNOWN,
  VC_PACKED_RGB,
  VC_PACKED_ARGB,
};

enum vcapture_error {
  VC_SUCCESS = 0,
  VC_INVALID_DEVICE = -1,
  VC_DEVICE_OPEN_ERROR = -2,
  VC_NO_ACTIVE_DEVICE = -3,
};

// zero means unspecified
// only type, width, and height are observed in set_format
struct vcapture_frame_format {
  enum vcapture_format_type type;
  int width;
  int height;
  int pad_left;
  int pad_top;
  int pad_right;
  int pad_bottom;
  int bytes_per_row;
};

typedef void (*vcapture_frame_callback)(const struct vcapture_frame_format* format,
                                        const void* bytes,
                                        unsigned long long int timestamp,
                                        void* userdata);


    
#ifdef __cplusplus
extern "C" {
#endif
  
  vcapture vcapture_alloc();
  void vcapture_free(vcapture v);

  int vcapture_device_count(vcapture v);
  const char* vcapture_device_name(vcapture v, int device);

  int vcapture_set_device(vcapture v, int device);
  int vcapture_get_device(vcapture v);

  int vcapture_get_format(vcapture v, struct vcapture_frame_format* f);
  int vcapture_set_format(vcapture v, const struct vcapture_frame_format* f);

  int vcapture_active(vcapture v);
  int vcapture_start(vcapture v);
  int vcapture_stop(vcapture v);

  vcapture_frame_callback vcapture_get_callback(vcapture v);

  int vcapture_set_callback(vcapture v,
                            vcapture_frame_callback callback,
                            void* userdata);


#ifdef __cplusplus
}
#endif
Member
Posts: 45
Joined: 2006.07
Post: #5
Replying to my own post... I made a call that just advances the run loop, which seems to allow the QTCaptureSession to do its thing.

Code:
void vcapture_poll(int msec) {
  NSTimeInterval interval = msec / 1000.0;
  NSDate* date = [[NSDate alloc] initWithTimeIntervalSinceNow:interval];
  [[NSRunLoop currentRunLoop]  runMode:NSDefaultRunLoopMode beforeDate:date];
  [date release];
}

I now have a camera doing stuff from a command line app. Yay!

Soon I'll post the full code for the C wrapper -- I'd be interested to see whether people can identify any obvious bugs in my Objective-C.
Member
Posts: 45
Joined: 2006.07
Post: #6
As threatened, here are the three source files: the C header, the Objective-C implementation file, and a simple C++ test program that just dumps a single PNG image from the first camera the program finds.

If anyone has time to look at the Objective-C code and tell me whether they see any glaring errors, I'd be thrilled.

video_capture.h:
Code:
#ifndef _VIDEO_CAPTURE_H_
#define _VIDEO_CAPTURE_H_

#define VC_MAX_PLANES 4

typedef struct video_capture* vcapture;

enum vcapture_format_type {
  VC_TYPE_UNKNOWN,
  VC_TYPE_RGB,         // packed: bytes = R0, G0, B0, ...
  VC_TYPE_ARGB,        // packed: bytes = A0, R0, G0, B0, ...
  VC_TYPE_RGBA,        // packed: bytes = R0, G0, B0, A0, ...
  VC_TYPE_FOURCC_2uvy, // packed: bytes = Cb, Y0, Cr, Y1, ... pixels 0 & 1 share Cb, Cr
  VC_TYPE_FOURCC_yuvs, // packed: bytes = Y0, Cb, Y1, Cr, ... pixels 0 & 1 share Cb, Cr
  VC_TYPE_FOURCC_YVYU, // packed: bytes = Y1, Cr, Y0, Cb, ... pixels 0 & 1 share Cb, Cr
  VC_TYPE_FOURCC_yuvu, // packed: bytes = Y0, Cb, Y1, Cr, ... pixels 0 & 1 share Cb, Cr
};

enum vcapture_error {
  VC_SUCCESS = 0,
  VC_INVALID_DEVICE = -1,
  VC_DEVICE_OPEN_ERROR = -2,
  VC_NO_ACTIVE_DEVICE = -3,
  VC_DEVICE_RUNNING = -4
};

struct vcapture_frame {
  int count; // 0 if component/packed video, otherwise number of planes
  void* planes[VC_MAX_PLANES];
};


// zero means unspecified
// only type, fourcc, width, and height are observed in set_format
// if type and fourcc are both set, fourcc wins
struct vcapture_frame_format {
  enum vcapture_format_type type;
  unsigned int fourcc;
  int width;
  int height;
  int bytes_per_row;
  int pad_left;
  int pad_top;
  int pad_right;
  int pad_bottom;
};

typedef void (*vcapture_frame_callback)(const struct vcapture_frame_format* format,
                                        const struct vcapture_frame* frame,
                                        unsigned long long int timestamp,
                                        void* userdata);


    
#ifdef __cplusplus
extern "C" {
#endif

unsigned int vcapture_type_to_fourcc(enum vcapture_format_type type);
enum vcapture_format_type vcapture_fourcc_to_type(unsigned int fourcc);

void vcapture_poll(int msec);

vcapture vcapture_alloc();
void vcapture_free(vcapture v);

int vcapture_device_count(vcapture v);
const char* vcapture_device_name(vcapture v, int device);

int vcapture_get_device(vcapture v);
int vcapture_set_device(vcapture v, int device);

int vcapture_get_requested_format(vcapture v, struct vcapture_frame_format* f);
int vcapture_set_requested_format(vcapture v, const struct vcapture_frame_format* f);

vcapture_frame_callback vcapture_get_callback(vcapture v);
int vcapture_set_callback(vcapture v,
                          vcapture_frame_callback callback,
                          void* userdata);

int vcapture_active(vcapture v);
int vcapture_start(vcapture v);
int vcapture_stop(vcapture v);

#ifdef __cplusplus
}
#endif

#endif

video_capture.m
Code:
// -*- mode: objc -*-

#import <QTKit/QTKit.h>
#include "video_capture.h"

#define DebugLog if (0) NSLog

@interface VideoCapture: NSObject
{
  QTCaptureSession*                 captureSession;
  NSMutableArray*                   videoDevices;
  QTCaptureDeviceInput*             deviceInput;
  QTCaptureDecompressedVideoOutput* videoOutput;

  vcapture_frame_callback callback;
  void* userdata;
  
  struct vcapture_frame_format requested;
  struct vcapture_frame_format actual;
  BOOL needActual;

}

- (NSArray*) devices;
- (int) device;
- (int) setDevice: (int)idx;

- (BOOL) active;
- (int) start;
- (int) stop;

- (vcapture_frame_callback) callback;
- (void) setCallback: (vcapture_frame_callback)callback withUserData:(void*)userdata;

- (void) getRequestedFormat:(struct vcapture_frame_format*)format;
- (int)  setRequestedFormat:(const struct vcapture_frame_format*)format;

- (void) captureOutput:(QTCaptureOutput *)captureOutput
         didOutputVideoFrame:(CVImageBufferRef)videoFrame
         withSampleBuffer:(QTSampleBuffer *)sampleBuffer
         fromConnection:(QTCaptureConnection *)connection;

@end

@implementation VideoCapture

- (id) init {
  if ( (self = [super init]) ) {

    NSAutoreleasePool* pool = [[NSAutoreleasePool alloc] init];

    captureSession = [[QTCaptureSession alloc] init];
    videoDevices = [[NSMutableArray alloc] init];
    deviceInput = nil;
    videoOutput = nil;
    callback = 0;
    userdata = 0;

    memset(&requested, 0, sizeof(struct vcapture_frame_format));
    memset(&actual, 0, sizeof(struct vcapture_frame_format));

    requested.type = VC_TYPE_RGB;
    needActual = YES;

    NSArray* allInputDevices = [QTCaptureDevice inputDevices];

    for (QTCaptureDevice* device in allInputDevices) {
      DebugLog(@"device: %@", device);
      if ([device hasMediaType:QTMediaTypeVideo] ||
          [device hasMediaType:QTMediaTypeMuxed]) {
        DebugLog(@"  it is video!");
        [videoDevices addObject:device];
      }
    }

    [pool release];

  }
  return self;
}

- (void) resetDevice {

  if ([self active]) {
    [self stop];
  }

  if (videoOutput) {
    [captureSession removeOutput:videoOutput];
    [videoOutput release];
    videoOutput = nil;
  }

  if (deviceInput) {
    [captureSession removeInput:deviceInput];
    QTCaptureDevice* device = [deviceInput device];
    if ([device isOpen]) {
      [device close];
    }
    [deviceInput release];
    deviceInput = nil;
  }

}

- (void) dealloc {
  DebugLog(@"deallocating a %@", [self className]);
  if (deviceInput) {
    if ([self active]) {
      DebugLog(@"stopping device in dealloc");
    }
    DebugLog(@"resetting device in dealloc");
  }
  [self resetDevice];
  [videoDevices release];
  [captureSession release];
  [super dealloc];
}

- (int) device {
  if (deviceInput == nil) {
    return -1;
  } else {
    return [videoDevices indexOfObject:[deviceInput device]];
  }
}

- (int) setDevice: (int)idx {

  if ([self device] == idx) {
    return VC_SUCCESS;
  }

  [self resetDevice];

  if (idx < 0) { return VC_SUCCESS; }
  if (idx >= [videoDevices count]) { return VC_INVALID_DEVICE; }


  QTCaptureDevice* device = [videoDevices objectAtIndex:idx];
  NSError* error = nil;

  BOOL success = [device open:&error];

  if (!success) {
    NSLog(@"Error opening device: %@", error);
    return VC_DEVICE_OPEN_ERROR;
  }

  deviceInput = [[QTCaptureDeviceInput alloc] initWithDevice: device];

  success = [captureSession addInput:deviceInput error:&error];
  if (!success) {
    NSLog(@"Error adding input: %@", error);
    [self resetDevice];
    return VC_DEVICE_OPEN_ERROR;
  }

  videoOutput = [[QTCaptureDecompressedVideoOutput alloc] init];
  success = [captureSession addOutput:videoOutput error:&error];
  if (!success) {
    NSLog(@"Error adding output: %@", error);
    [self resetDevice];
    return VC_DEVICE_OPEN_ERROR;
  }

  [videoOutput setDelegate:self];
  [videoOutput setAutomaticallyDropsLateVideoFrames:YES];

  needActual = YES;


  NSMutableDictionary* pixelBufferAttributes = [[NSMutableDictionary alloc] init];
  NSString* key;
  NSNumber* val;

  
  if (requested.fourcc || requested.type) {

    unsigned int type;
    if (requested.fourcc) {
      type = requested.fourcc;
    } else {
      type = vcapture_type_to_fourcc(requested.type);
    }

    key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
    val = [NSNumber numberWithUnsignedInt:type];

    DebugLog(@"set pixelBufferAttributes %@->%@", key, val);

    [pixelBufferAttributes setObject:val forKey:key];

  }

  if (requested.width && requested.height) {

    key = (NSString*)kCVPixelBufferWidthKey;
    val = [NSNumber numberWithInt:requested.width];
    [pixelBufferAttributes setObject:val forKey:key];
    DebugLog(@"set pixelBufferAttributes %@->%@", key, val);

    key = (NSString*)kCVPixelBufferHeightKey;
    val = [NSNumber numberWithInt:requested.height];
    [pixelBufferAttributes setObject:val forKey:key];
    DebugLog(@"set pixelBufferAttributes %@->%@", key, val);

  }

  if ([pixelBufferAttributes count]) {
    [videoOutput setPixelBufferAttributes:pixelBufferAttributes];
  }

  // the output copies the attributes, so release our dictionary either way
  [pixelBufferAttributes release];
  
  return VC_SUCCESS;

}

- (NSArray*) devices {
  return videoDevices;
}

- (vcapture_frame_callback) callback {
  return callback;
}

- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
  withSampleBuffer:(QTSampleBuffer *)sampleBuffer
  fromConnection:(QTCaptureConnection *)connection

{

  [sampleBuffer incrementSampleUseCount];

  CVReturn status = CVPixelBufferLockBaseAddress(videoFrame, 0);

  if (status == 0) {

    if (needActual) {

      needActual = NO;

      actual.fourcc = CVPixelBufferGetPixelFormatType(videoFrame);
      actual.type = vcapture_fourcc_to_type(actual.fourcc);
      actual.width = CVPixelBufferGetWidth(videoFrame);
      actual.height = CVPixelBufferGetHeight(videoFrame);

      size_t l, r, t, b;

      CVPixelBufferGetExtendedPixels(videoFrame, &l, &r, &t, &b);

      actual.pad_left = l;
      actual.pad_right = r;
      actual.pad_top = t;
      actual.pad_bottom = b;

      actual.bytes_per_row = CVPixelBufferGetBytesPerRow(videoFrame);
    
    }

    struct vcapture_frame frame;

    memset(frame.planes, 0, sizeof(frame.planes));

    unsigned long long timestamp = 0;


    if (CVPixelBufferIsPlanar(videoFrame)) {
      frame.count = CVPixelBufferGetPlaneCount(videoFrame);
      if (frame.count > VC_MAX_PLANES) { frame.count = VC_MAX_PLANES; }
      int i;
      for (i=0; i<frame.count; ++i) {
        frame.planes[i] = CVPixelBufferGetBaseAddressOfPlane(videoFrame, i);
      }
    } else {
      frame.count = 0;
      frame.planes[0] = CVPixelBufferGetBaseAddress(videoFrame);    
    }
  
    if (callback) {
      callback(&actual, &frame, timestamp, userdata);
    }
  
    CVPixelBufferUnlockBaseAddress(videoFrame, 0);

  }

  [sampleBuffer decrementSampleUseCount];

}

- (BOOL) active {
  return [captureSession isRunning];
}

- (int) start {
  if ([self device] < 0) { return VC_NO_ACTIVE_DEVICE; }
  [captureSession startRunning];
  return VC_SUCCESS;
}

- (int) stop {
  if ([captureSession isRunning]) {
    [captureSession stopRunning];
  }
  return VC_SUCCESS;
}

- (void) setCallback: (vcapture_frame_callback)c withUserData:(void*)u {
  callback = c;
  userdata = u;
}

- (void) getRequestedFormat:(struct vcapture_frame_format*)format {
  memcpy(format, &requested, sizeof(struct vcapture_frame_format));
}

- (int) setRequestedFormat:(const struct vcapture_frame_format*)format {
  if ([self active]) {
    return VC_DEVICE_RUNNING;
  }
  int oldDevice = [self device];
  [self resetDevice];
  memcpy(&requested, format, sizeof(struct vcapture_frame_format));
  return [self setDevice:oldDevice];
}

@end

//////////////////////////////////////////////////////////////////////

unsigned int vcapture_type_to_fourcc(enum vcapture_format_type t) {
  switch (t) {
  case VC_TYPE_RGB: return k24RGBPixelFormat;
  case VC_TYPE_RGBA: return kCVPixelFormatType_32RGBA;
  case VC_TYPE_ARGB: return kCVPixelFormatType_32ARGB;
  case VC_TYPE_FOURCC_2uvy: return '2uvy';
  case VC_TYPE_FOURCC_yuvs: return 'yuvs';
  case VC_TYPE_FOURCC_YVYU: return 'YVYU';
  case VC_TYPE_FOURCC_yuvu: return 'yuvu';
  default:
    break;
  }
  return 0;
}

enum vcapture_format_type vcapture_fourcc_to_type(unsigned int fourcc) {
  switch (fourcc) {
  case k24RGBPixelFormat: return VC_TYPE_RGB;
  case kCVPixelFormatType_32ARGB: return VC_TYPE_ARGB;
  case kCVPixelFormatType_32RGBA: return VC_TYPE_RGBA;
  case '2uvy': return VC_TYPE_FOURCC_2uvy;
  case 'yuvs': return VC_TYPE_FOURCC_yuvs;
  case 'YVYU': return VC_TYPE_FOURCC_YVYU;
  case 'yuvu': return VC_TYPE_FOURCC_yuvu;
  default:
    break;
  }
  return VC_TYPE_UNKNOWN;
}

//////////////////////////////////////////////////////////////////////

void vcapture_poll(int msec) {
  NSTimeInterval interval = msec / 1000.0;
  NSDate* date = [[NSDate alloc] initWithTimeIntervalSinceNow:interval];
  [[NSRunLoop currentRunLoop]  runMode:NSDefaultRunLoopMode beforeDate:date];
  [date release];
}

//////////////////////////////////////////////////////////////////////

struct video_capture {
  NSAutoreleasePool* pool;
  VideoCapture* instance;
};

//////////////////////////////////////////////////////////////////////

vcapture vcapture_alloc() {
  vcapture rval = (vcapture)malloc(sizeof(struct video_capture));
  if (!rval) { return 0; }
  rval->pool = [[NSAutoreleasePool alloc] init];
  rval->instance = [[VideoCapture alloc] init];
  if (!rval->instance) {
    vcapture_free(rval);
    rval = 0;
  }
  return rval;
}

void vcapture_free(vcapture v) {
  if (!v) { return; }
  [v->instance release];
  [v->pool release];
  free(v);
}

//////////////////////////////////////////////////////////////////////

int vcapture_device_count(vcapture v) {
  return [[v->instance devices] count];
}

const char* vcapture_device_name(vcapture v, int idx) {

  NSArray* devices = [v->instance devices];
  if (idx >= [devices count]) {
    return 0;
  }

  QTCaptureDevice* device = [devices objectAtIndex:idx];
  NSString* str = [device description];

  if (!str) { return 0; }

  // note: the returned pointer belongs to an autoreleased string, so it is
  // only valid until the enclosing autorelease pool drains
  return [str cStringUsingEncoding:NSUTF8StringEncoding];

}

//////////////////////////////////////////////////////////////////////

int vcapture_get_device(vcapture v) {
  return [v->instance device];
}

int vcapture_set_device(vcapture v, int device) {
  return [v->instance setDevice:device];
}

//////////////////////////////////////////////////////////////////////

int vcapture_get_requested_format(vcapture v, struct vcapture_frame_format* f) {
  [v->instance getRequestedFormat:f];
  return VC_SUCCESS;
}

int vcapture_set_requested_format(vcapture v, const struct vcapture_frame_format* f) {
  return [v->instance setRequestedFormat:f];
}

//////////////////////////////////////////////////////////////////////

int vcapture_set_callback(vcapture v,
                          vcapture_frame_callback callback,
                          void* userdata) {
  [v->instance setCallback:callback withUserData:userdata];
  return VC_SUCCESS;
}

vcapture_frame_callback vcapture_get_callback(vcapture v) {
  return [v->instance callback];
}

//////////////////////////////////////////////////////////////////////

int vcapture_active(vcapture v) {
  return [v->instance active];
}

int vcapture_start(vcapture v) {
  return [v->instance start];
}

int vcapture_stop(vcapture v) {
  return [v->instance stop];
}

testvideo.cpp
Code:
#include "video_capture.h"
#include <stdio.h>
#include <string.h> // for memcpy/memset
#include <iostream>
#include <vector>
#include <png.h>

vcapture_frame_format gFormat;
std::vector<png_byte> gBuffer;
int gShouldRun = 1;

void dump_png();

void console_callback(const vcapture_frame_format* format,
                      const vcapture_frame* frame,
                      unsigned long long int timestamp,
                      void* userdata) {

  if (gShouldRun) {
    gShouldRun = 0;
    gFormat = *format;
    gBuffer.resize(format->height * format->bytes_per_row);
    memcpy(&(gBuffer[0]), frame->planes[0], gBuffer.size());
  }

}

int main(int argc, char** argv) {
  
  vcapture v = vcapture_alloc();

  vcapture_frame_format format;
  memset(&format, 0, sizeof(format));
  format.type = VC_TYPE_RGB;
  format.width = 320;
  format.height = 240;
  
  vcapture_set_requested_format(v, &format);

  vcapture_set_callback(v, console_callback, 0);
  printf("callback is %p\n", vcapture_get_callback(v));

  int n = vcapture_device_count(v);

  for (int i=0; i<n; ++i) {
    printf("device %d is '%s'\n", i, vcapture_device_name(v, i));
  }

  if (n) {
    int status = vcapture_set_device(v, 0);
    if (status == 0) {
      printf("set device!\n");
      status = vcapture_start(v);
      if (status == 0) {
        printf("started!\n");
        while (gShouldRun) { vcapture_poll(1000); }
        vcapture_stop(v);
      }
    }
  }

  vcapture_free(v);

  if (gBuffer.size() && gFormat.type == VC_TYPE_RGB) {
    dump_png();
  }

  return 0;

}


void dump_png() {

  FILE* fp = fopen("frame.png", "wb");
  if (!fp) {
    std::cerr << "couldn't open frame.png for output!\n";
    return;
  }
  
  png_structp png_ptr = png_create_write_struct
    (PNG_LIBPNG_VER_STRING, NULL, NULL, NULL);

  if (!png_ptr) {
    std::cerr << "error creating png write struct\n";
    return;
  }
  
  png_infop info_ptr = png_create_info_struct(png_ptr);
  if (!info_ptr) {
    std::cerr << "error creating png info struct\n";
    png_destroy_write_struct(&png_ptr, (png_infopp)NULL);
    fclose(fp);
    return;
  }  

  if (setjmp(png_jmpbuf(png_ptr))) {
    std::cerr << "error in png processing\n";
    png_destroy_write_struct(&png_ptr, &info_ptr);
    fclose(fp);
    return;
  }

  png_init_io(png_ptr, fp);

  int w = gFormat.width;
  int h = gFormat.height;

  png_set_IHDR(png_ptr, info_ptr,
           w, h,
           8,
           PNG_COLOR_TYPE_RGB,
           PNG_INTERLACE_NONE,
           PNG_COMPRESSION_TYPE_DEFAULT,
           PNG_FILTER_TYPE_DEFAULT);

  png_write_info(png_ptr, info_ptr);

  png_bytep rowptr = &(gBuffer[0]);

  std::cerr << "w = " << w << ", h = " << h << ", stride = " << gFormat.bytes_per_row << "\n";

  for (int y=0; y<h; ++y) {
    png_write_row(png_ptr, rowptr);
    rowptr += gFormat.bytes_per_row;
  }

  png_write_end(png_ptr, info_ptr);

  png_destroy_write_struct(&png_ptr, &info_ptr);

  fclose(fp);


}