
Creating a QTKit Stop or Still Motion Application

Now that you've worked through the examples in the previous chapters of this guide—building and extending the functionality of your QTKit capture player application, adding audio and DV camera support—you'll be ready to take on another coding assignment. The goal here is, once again, to extend your knowledge of the QTKit capture API.

Because the API supports frame-accurate, real-time motion capture, you'll discover a broad range of possible uses and applications. One such use is still or stop motion animation, a popular technique dating back to the early days of film and showcased in classics such as Walt Disney’s 1959 stop motion short Noah’s Ark. Basically, it involves making still objects appear as if they are in motion by stringing single frames together into a movie. It’s a way of animating objects and bringing them visually to life on the screen. Stop or still motion animators have employed this technique—making static objects appear to move when played back at normal speed—in countless movies, TV commercials, and TV shows.

Following the steps outlined in this chapter, you'll construct a simple still motion capture application that lets you capture a live video feed, grab frames one at a time with great accuracy, and then record the output of those frames to a QuickTime movie. You’ll be able to accomplish this with less than 100 lines of Objective-C code, constructing the sample as you’ve done in previous chapters, in Xcode 3 and Interface Builder 3.

In building your still motion capture application, you’ll work with the following three QTKit capture classes:

QTCaptureSession
QTCaptureDeviceInput
QTCaptureDecompressedVideoOutput

In this section:

Set Up Your Project
Prototype the Still Motion Capture Application
Create the Project Using Xcode 3
Create the User Interface Using Interface Builder 3
Prepare to Capture Single Frame Video
Complete the Project Nib File in Xcode
Implement and Build Your Still Motion Capture Application


Set Up Your Project

If you’ve come to this chapter without working through the code examples in the previous chapters of this guide, you may want to return to the sections beginning with “First Steps.” Those sections provide you with a background understanding of how to work with Xcode and Interface Builder.

You’ll need, as described in previous chapters, to be running Mac OS X v10.5 and have the following items installed on your system:

Xcode 3 and Interface Builder 3
A video capture device connected to your computer, such as a built-in or external iSight camera

Prototype the Still Motion Capture Application

Just as you’ve done in the section “Prototype the Capture Player,” you may want to start by creating a rough sketch of your QTKit still motion capture application. Think, again, of what design elements you want to incorporate into the application. Rather than simply jumping into Interface Builder and doing your prototype there, you may want to visualize the elements first in your rough sketch, as shown in Figure 4-1.


Figure 4-1  Prototype sketch of QTKit still motion capture application

Prototype sketch of QTKit still motion capture application

In this design prototype, you can start with three simple objects: a capture view, a QuickTime movie view, and a single button to add frames. These will be the building blocks for your application. You can add more complexity to the design later on. After you’ve sketched out your prototype, think how you’ll be able to hook up the objects in Interface Builder and what code you need in your Xcode project to make this happen.

Create the Project Using Xcode 3

To create the project, follow these steps:

  1. Launch Xcode 3 (shown in Figure 4-2) and choose File > New Project.

    Figure 4-2  The Xcode 3 icon

    The Xcode 3 icon
  2. When the new project window appears, select Cocoa Document-based Application.

  3. Name the project StillMotion and navigate to the location where you want Xcode to create the project folder. The Xcode project window appears, as shown in Figure 4-3.

    Figure 4-3  The StillMotion Xcode project window

    The StillMotion Xcode project window
  4. Next, you need to add the QuickTime Kit framework to your StillMotion project. This framework resides in the /System/Library/Frameworks directory. Though simple, this step is easy to forget. Note that you don’t need to add the QuickTime framework to your project, just the QuickTime Kit framework. Choose Project > Add to Project.

  5. Select QTKit.framework, and click Add when the Add To Targets window appears to add it to your project.

  6. Now you also need to add the Quartz Core framework to your project. It resides in the /System/Library/Frameworks directory. Choose Project > Add to Project. Select QuartzCore.framework, and click Add when the Add To Targets window appears.

    Important: This completes the first sequence of steps in your project. In the next sequence, you’ll define actions and outlets in Xcode before working with Interface Builder. This may require a shift in how you’re used to building an application with versions of Interface Builder prior to Interface Builder 3. Because you’ve already prototyped your QTKit still motion capture application, at least in rough form with a clearly defined data model, you can now determine which actions and outlets need to be implemented. In this case, you have a QTCaptureView object (a subclass of NSView), a QTMovieView object to display your captured frames, and one button to record your captured media content and add each single frame to your QuickTime movie output.

Import the QTKit Headers and Set Up Your Implementation File

  1. Double-click your MyDocument.h declaration file in your Xcode project and open it. In the file, delete the #import <Cocoa/Cocoa.h> statement and replace it with #import <QTKit/QTKit.h>.

  2. Double-click your MyDocument.m implementation file in your project to open it. Delete the contents of the file except for the following lines of code:

    #import "MyDocument.h"
    @implementation MyDocument
    - (NSString *)windowNibName
    {
        return @"MyDocument";
    }
    @end

Determine the Actions and Outlets You Want

  1. Now you can begin adding outlets and actions. In your MyDocument.h file, add the instance variables mCaptureView and mMovieView in the following lines of code:

    IBOutlet QTCaptureView *mCaptureView;
    IBOutlet QTMovieView   *mMovieView;
  2. You also want to add this action method:

    - (IBAction)addFrame:(id)sender;
  3. Now open your MyDocument.m file and add the same action method, followed by braces for the code you’ll add later to implement this action.

    - (IBAction)addFrame:(id)sender
    {
    }

This completes the second stage of your project. Now you’ll need to shift gears and work with Interface Builder 3 to construct the user interface for your project.

Create the User Interface Using Interface Builder 3

In the next phase of your project you’ll see how seamlessly Interface Builder and Xcode work together, enabling you to construct and implement the various elements in your project more efficiently and with less overhead.

  1. Open Interface Builder 3 (Figure 4-4) and click the MyDocument.nib file in your Xcode project window. Because of the new integration between Xcode 3 and Interface Builder 3, you’ll find the actions and outlets you’ve declared in your MyDocument.h file are also synchronously updated in Interface Builder 3. This will become apparent once you begin to work with your MyDocument nib file and the library of controls available in Interface Builder 3.

    Figure 4-4  The new Interface Builder 3 icon

    The new Interface Builder 3 icon
  2. In Interface Builder 3, you’ll find a new library of controls. Scroll down until you find the QuickTime Capture View control, as shown in Figure 4-5.

    Figure 4-5  The QuickTime Capture View object in the library

    The QuickTime Capture View object in the library

    The QTCaptureView object provides you with an instance of a view subclass to display a preview of the video output that is captured by a capture session.

  3. Drag the QTCaptureView object into your window and resize the object to fit the window, allowing room at the bottom for your Add Frame button (already shown in the illustration below) and to the right for your QTMovieView object in your QTKit still motion capture application.

    Choose Tools > Inspector. In the Identity Inspector, select the information (“i”) icon. Click in the Class field and your QTCaptureView object appears, as shown in Figure 4-6.

    Figure 4-6  The QTCaptureView window and Class Identity in the Inspector

    The QTCaptureView window and Class identity in the Inspector
  4. Set the autosizing for the object in the Capture View Size Inspector, as shown in Figure 4-7.

    Figure 4-7  Setting the autosizing for your QTCaptureView object

    Setting the autosizing for your QTCaptureView object
  5. Now you want to repeat the same sequence of steps to add your QTMovieView object to your Window (already shown above). Scroll down in the Library of controls until you find the QTMovieView object, shown in Figure 4-8.

    Figure 4-8  The QTMovieView control in the Interface Builder library

    The QTMovieView control in the Interface Builder library
  6. Select the QTMovieView object (symbolized by the blue Q) and drag it into your Window next to the QTCaptureView object, shown below.

  7. Choose Tools > Inspector. In the Identity Inspector, select the information (“i”) icon. Click in the Class field and your QTMovieView object appears, as shown in Figure 4-9.

    Figure 4-9  The QTMovieView object defined in the field Class

    The QTMovieView object defined in the field Class
  8. Follow the procedure in step 4 above to set the autosizing for your QTMovieView object.

  9. Now you want to specify the Window attributes in your MyDocument.nib file. Select the Window object in your nib and click the attributes icon in the Inspector, as shown in Figure 4-10.

    Figure 4-10  The Window attributes in MyDocument.nib

    The Window attributes in MyDocument.nib
  10. Define the Window size you want in your MyDocument.nib by selecting the size icon (symbolized by a ruler) in the Window Inspector, shown in Figure 4-11.

    Figure 4-11  Defining the size of the Window

    Defining the size of the Window
  11. Specify the delegate outlet connections of File’s Owner in the Window Connections Inspector, as shown in Figure 4-12.

    Figure 4-12  Specifying the delegate outlet connection for File’s Owner

    Specifying the delegate outlet connection for File’s Owner
  12. In the Library, select the Push Button control and drag it to the Window. Enter the text Add Frame. In autosizing, set the struts for the button at the center and right outside corner, leaving the inside struts untouched, as shown in Figure 4-13.

    Figure 4-13  Specifying the autosizing for the Add Frame button

    Specifying the autosizing for the Add Frame button
  13. Select the MyDocument.nib and click the Connections Inspector. Now you want to wire up the outlets and received actions, as shown in Figure 4-14. Control-drag each outlet instance variable to the appropriate MyDocument.nib object.

    Figure 4-14  Specifying the actions and outlet connections in MyDocument.nib

    Specifying the actions and outlet connections in MyDocument.nib
  14. Select the File’s Owner object in your MyDocument.nib and then click the Class Identity icon in the Interface Builder Inspector, as shown in Figure 4-15. Note that the green light at the left corner of your StillMotion.xcodeproj is turned on, indicating that Xcode and Interface Builder have synchronized the actions and outlets in your project.

    Figure 4-15  The My Document Identity Inspector with File’s Owner selected

    The My Document Identity Inspector with File’s Owner selected

Prepare to Capture Single Frame Video

In the previous phase of your project, you saw how seamlessly Interface Builder and Xcode work together, enabling you to construct and implement the various elements in your project more efficiently and with less overhead. After completing the code you need to add to your Xcode project, described in the next section, “Complete the Project Nib File in Xcode,” you’ll be ready to move ahead and capture single frame video, using your still motion capture application, as shown in Figure 4-16.


Figure 4-16   Preparing to capture single video frames and outputting those frames to a QuickTime movie

Preparing to capture single video frames and outputting those frames to a QuickTime movie

Complete the Project Nib File in Xcode

To complete the project nib file, you’ll need to define the instance variables that point to the capture session, as well as to the device input and decompressed video output objects.

  1. In your Xcode project, you need to add the instance variables to the interface declaration. Add these lines of code in your MyDocument.h declaration file:

    @interface MyDocument : NSDocument
    {
       QTMovie                                *mMovie;
       QTCaptureSession                       *mCaptureSession;
       QTCaptureDeviceInput                   *mCaptureDeviceInput;
       QTCaptureDecompressedVideoOutput       *mCaptureDecompressedVideoOutput;
    }

    The mMovie instance variable points to the QTMovie object, and mCaptureSession points to the QTCaptureSession object. Likewise, mCaptureDeviceInput points to the QTCaptureDeviceInput object, and mCaptureDecompressedVideoOutput points to the QTCaptureDecompressedVideoOutput object.

  2. There is one more instance variable you need to declare in this file: mCurrentImageBuffer. This instance variable stores the most recent frame that you’ve grabbed in a CVImageBufferRef. Add this line of code, following your last declaration:

     CVImageBufferRef                    mCurrentImageBuffer;
  3. That completes the code you need to add to your MyDocument.h file. Now you want to open your MyDocument.m file and prepare to add the following blocks of code to your project. Note that the code is commented for better understanding.

    Important: There is a specific, though not necessarily rigid, order of steps you want to follow in constructing your code. Think of these as specific tasks you want to accomplish in your project.

    1. Create an empty movie that writes to mutable data in memory, using the initToWritableData: method.

    2. Set up a capture session that outputs the raw frames you want to grab.

    3. Find a video device and add a device input for that device to the capture session.

    4. Add a decompressed video output that returns the raw frames you’ve grabbed to the session and then previews the video from the session in the document window.

    5. Start the session, using the startRunning method you’ve used previously in the MyRecorder sample code.

    6. Call a delegate method whenever the QTCaptureDecompressedVideoOutput object receives a frame.

    7. Store the latest frame. Do this in a @synchronized block because the delegate method is not called on the main thread.

    8. Get the most recent frame. Do this in a @synchronized block because the delegate method that sets the most recent frame is not called on the main thread.

    9. Create an NSImage and add it to the movie.

  4. Following the steps outlined above, add this block of code to your MyDocument.m file.

    - (void)windowControllerDidLoadNib:(NSWindowController *) aController
    {
        NSError *error = nil;
        [super windowControllerDidLoadNib:aController];
        [[aController window] setDelegate:self];
        if (!mMovie) {
            // Create an empty movie that writes to mutable data in memory
            mMovie = [[QTMovie alloc] initToWritableData:[NSMutableData data] error:&error];
            if (!mMovie) {
                [[NSAlert alertWithError:error] runModal];
                return;
            }
        }
        [mMovieView setMovie:mMovie];
        if (!mCaptureSession) {
            // Set up a capture session that outputs raw frames
            BOOL success;
            mCaptureSession = [[QTCaptureSession alloc] init];
            // Find a video device
            QTCaptureDevice *device = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
            success = [device open:&error];
            if (!success) {
                [[NSAlert alertWithError:error] runModal];
                return;
            }
            // Add a device input for that device to the capture session
            mCaptureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:device];
            success = [mCaptureSession addInput:mCaptureDeviceInput error:&error];
            if (!success) {
                [[NSAlert alertWithError:error] runModal];
                return;
            }
            // Add a decompressed video output that returns raw frames to the session
            mCaptureDecompressedVideoOutput = [[QTCaptureDecompressedVideoOutput alloc] init];
            [mCaptureDecompressedVideoOutput setDelegate:self];
            success = [mCaptureSession addOutput:mCaptureDecompressedVideoOutput error:&error];
            if (!success) {
                [[NSAlert alertWithError:error] runModal];
                return;
            }
            // Preview the video from the session in the document window
            [mCaptureView setCaptureSession:mCaptureSession];
     
            // Start the session
            [mCaptureSession startRunning];
        }
    }
  5. Add these lines to handle window closing notifications for your device input and stop the capture session:

    - (void)windowWillClose:(NSNotification *)notification
    {
        [mCaptureSession stopRunning];
        QTCaptureDevice *device = [mCaptureDeviceInput device];
        if ([device isOpen])
            [device close];
    }
  6. Insert the following block of code to handle deallocation of memory for your capture objects:

    - (void)dealloc
    {
        [mMovie release];
        [mCaptureSession release];
        [mCaptureDeviceInput release];
        [mCaptureDecompressedVideoOutput release];
        [super dealloc];
    }
  7. Add the following lines of code to read a movie into your document as an editable QuickTime movie and to write your recorded media out as a flattened, self-contained movie:

    - (BOOL)readFromURL:(NSURL *)absoluteURL ofType:(NSString *)typeName error:(NSError **)outError
    {
        QTMovie *newMovie = [[QTMovie alloc] initWithURL:absoluteURL error:outError];
        if (newMovie) {
            [newMovie setAttribute:[NSNumber numberWithBool:YES] forKey:QTMovieEditableAttribute];
            [mMovie release];
            mMovie = newMovie;
        }
        return (newMovie != nil);
    }
    - (BOOL)writeToURL:(NSURL *)absoluteURL ofType:(NSString *)typeName error:(NSError **)outError
    {
        return [mMovie writeToFile:[absoluteURL path] withAttributes:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:QTMovieFlatten] error:outError];
    }
  8. Add these lines of code to call a delegate method whenever the QTCaptureDecompressedVideoOutput object receives a frame:

     - (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputVideoFrame:(CVImageBufferRef)videoFrame withSampleBuffer:(QTSampleBuffer *)sampleBuffer fromConnection:(QTCaptureConnection *)connection
    {
        // Store the latest frame
        // This must be done in a @synchronized block because this delegate method is not called on the main thread
        CVImageBufferRef imageBufferToRelease;
     
        CVBufferRetain(videoFrame);
     
        @synchronized (self) {
            imageBufferToRelease = mCurrentImageBuffer;
            mCurrentImageBuffer = videoFrame;
        }
        CVBufferRelease(imageBufferToRelease);
    }
  9. Now you want to specify the addFrame: action method and get the most recent frame that you’ve grabbed. Do this in a @synchronized block because the delegate method that sets the most recent frame is not called on the main thread. Note that you’re wrapping a CVImageBufferRef object into an NSImage. After you create an NSImage, you can then add it to the movie.

    - (IBAction)addFrame:(id)sender
    {
        CVImageBufferRef imageBuffer;
        @synchronized (self) {
            imageBuffer = CVBufferRetain(mCurrentImageBuffer);
        }
        if (imageBuffer) {
            // Create an NSImage and add it to the movie
            NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:imageBuffer]];
            NSImage *image = [[[NSImage alloc] initWithSize:[imageRep size]] autorelease];
            [image addRepresentation:imageRep];
            CVBufferRelease(imageBuffer);
        [mMovie addImage:image forDuration:QTMakeTime(1, 10) withAttributes:[NSDictionary dictionaryWithObjectsAndKeys:
            @"jpeg", QTAddImageCodecType, nil]];
            [mMovie setCurrentTime:[mMovie duration]];
            [mMovieView setNeedsDisplay:YES];
            [self updateChangeCount:NSChangeDone];
        }
    }
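The @synchronized blocks in steps 7 and 8 deserve a second look, because the same latest-frame “mailbox” pattern is useful anywhere a background thread produces data that the main thread consumes on demand. Here is a sketch of the pattern in plain C with a pthread mutex. Everything in it (the Frame type, store_latest, take_latest) is illustrative and not part of QTKit or Core Video, and for simplicity it transfers ownership of the frame outright, whereas the Objective-C code above shares the buffer through CVBufferRetain and CVBufferRelease.

```c
#include <pthread.h>
#include <stdlib.h>

/* A toy "frame": just a heap block tagged with a frame number.
   Illustrative only -- stands in for a CVImageBufferRef. */
typedef struct {
    int number;
} Frame;

static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static Frame *latest = NULL;   /* plays the role of mCurrentImageBuffer */

/* Producer side, like the capture delegate: swap the new frame in
   under the lock, then free the displaced one outside the lock
   (the real code calls CVBufferRelease there). */
void store_latest(Frame *incoming)
{
    Frame *displaced;
    pthread_mutex_lock(&lock);
    displaced = latest;
    latest = incoming;
    pthread_mutex_unlock(&lock);
    free(displaced);
}

/* Consumer side, like addFrame:: take ownership of the latest frame
   under the lock so the producer cannot free it out from under us.
   Returns NULL if no frame has arrived since the last call. */
Frame *take_latest(void)
{
    Frame *grabbed;
    pthread_mutex_lock(&lock);
    grabbed = latest;
    latest = NULL;
    pthread_mutex_unlock(&lock);
    return grabbed;   /* caller is responsible for freeing it */
}
```

The key design point carries over directly: the pointer swap happens inside the critical section, but the potentially expensive release happens outside it, so the lock is held only for a few instructions.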

Implement and Build Your Still Motion Capture Application

After you’ve saved your project, click Build and Go. After the application compiles and launches, click the Add Frame button to record each captured frame and output that frame to a QuickTime movie. The output of your capture session is saved as a QuickTime movie.

Now you can begin capturing and recording with your QTKit still motion capture application. Using a simple iSight camera, for example, you can produce output such as the three-frame QuickTime movie shown in Figure 4-17.


Figure 4-17  Still motion recorded output as a three-frame QuickTime movie

Still motion recorded output as a three-frame QuickTime movie
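A note on frame timing: the QTMakeTime(1, 10) call in the addFrame: method gives each still frame a duration of 1 unit on a 10-units-per-second timescale, that is, one tenth of a second, for an effective playback rate of 10 frames per second. The arithmetic can be sketched in plain C as follows; the FrameDuration type and the two helper functions are illustrative, not part of the QTKit API.

```c
/* Duration expressed the way QTMakeTime() expresses it: a count of
   time units against a timescale, where the timescale is the number
   of units per second. QTMakeTime(1, 10) is one unit on a
   10-units-per-second clock: 0.1 second per frame. */
typedef struct {
    long value;   /* duration, in time units */
    long scale;   /* time units per second */
} FrameDuration;

/* Effective playback rate, in frames per second. */
double frames_per_second(FrameDuration d)
{
    return (double)d.scale / (double)d.value;
}

/* Total running time, in seconds, of a movie built from n such frames. */
double movie_seconds(FrameDuration d, int n)
{
    return (double)(n * d.value) / (double)d.scale;
}
```

At this rate the three-frame movie shown above plays for only 0.3 second, and a one-minute stop motion sequence would require 600 captured frames, which is why stop motion animators typically plan their shots frame by frame before shooting.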





© 2007 Apple Inc. All Rights Reserved. (Last updated: 2007-10-31)

