
Adding Audio Input and DV Camera Support

If you’ve worked through the sequence of steps outlined in the previous chapter, you’re now ready to extend the functionality of your QTKit capture player application.

In this chapter, you’ll add audio input capability to your capture application, as well as support for input from DV cameras other than your built-in or attached iSight camera. This is accomplished with only a dozen lines of Objective-C code, with error handling included.

In this section:

Add Instance Variables
Modify Methods


Add Instance Variables

Follow these steps to add audio input capability and video input from DV cameras.

  1. Launch Xcode 3 and open your MyRecorder project. Select the MyRecorderController.h declaration file. Add the two QTCaptureDeviceInput instance variables shown here, so that the code looks like this:

     
    #import <Cocoa/Cocoa.h>
    #import <QTKit/QTKit.h>
     
    @interface MyRecorderController : NSObject {
        IBOutlet QTCaptureView      *mCaptureView;
     
        QTCaptureSession            *mCaptureSession;
        QTCaptureMovieFileOutput    *mCaptureMovieFileOutput;
        QTCaptureDeviceInput        *mCaptureVideoDeviceInput;
        QTCaptureDeviceInput        *mCaptureAudioDeviceInput;
    }
    - (IBAction)startRecording:(id)sender;
    - (IBAction)stopRecording:(id)sender;
     
    @end
  2. Note that you’ve added two instance variables of type QTCaptureDeviceInput. These are the video and audio device input variables that enable you to capture audio, as well as video from external DV cameras. (A short device-enumeration sketch follows this list.)

        QTCaptureDeviceInput        *mCaptureVideoDeviceInput;
        QTCaptureDeviceInput        *mCaptureAudioDeviceInput;
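
Before wiring up these inputs, you may want to confirm which capture devices your system actually exposes. The following is a minimal sketch, not part of the MyRecorder sample, that you could add to MyRecorderController.m to log every video, muxed, and sound input device QTKit can find; the method name logAvailableCaptureDevices is an arbitrary choice for illustration.

    // A minimal sketch (not part of the MyRecorder sample): log the capture
    // devices QTKit can see, so you can verify that your DV camera shows up
    // as a muxed device and that an audio input device is available.
    - (void)logAvailableCaptureDevices
    {
        NSArray *mediaTypes = [NSArray arrayWithObjects:
            QTMediaTypeVideo, QTMediaTypeMuxed, QTMediaTypeSound, nil];
        for (NSString *mediaType in mediaTypes) {
            NSArray *devices = [QTCaptureDevice inputDevicesWithMediaType:mediaType];
            for (QTCaptureDevice *device in devices) {
                NSLog(@"%@ device: %@", mediaType, [device localizedDisplayName]);
            }
        }
    }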

Modify Methods

Now open your MyRecorderController.m implementation file.

  1. Scroll down to the code block that begins with the comment // Find a video device. This block, shown here, finds a video device such as the iSight camera; you’ll add a new block immediately after it.

    // Find a video device
    QTCaptureDevice *videoDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    success = [videoDevice open:&error];
  2. Add the following block, which finds and opens a muxed video input device, such as a DV camera, if no separate video device is available. (In a muxed stream, the audio and video tracks are mixed together.)

    // If a video input device can't be found or opened, try to find and open a muxed input device
    if (!success) {
        videoDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeMuxed];
        success = [videoDevice open:&error];
    }
    if (!success) {
        videoDevice = nil;
        // Handle error
    }
    if (videoDevice) {
  3. Scroll down to the block of code that begins with the comment // Add the video device to the session as a device input. After that block, add the following lines, which add support for audio from an audio input device. If the video device doesn’t already supply sound (QTMediaTypeSound) or muxed media, the code finds the default audio input device, opens it, and adds it to the capture session so that your audio stream is captured as well. (A sketch of one way to fill in the // Handle error placeholders appears after these steps.)

            mCaptureVideoDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:videoDevice];
            success = [mCaptureSession addInput:mCaptureVideoDeviceInput error:&error];
            if (!success) {
                // Handle error
            }
     
        // If the video device doesn't also supply audio, add an audio device input to the session
     
            if (![videoDevice hasMediaType:QTMediaTypeSound] && ![videoDevice hasMediaType:QTMediaTypeMuxed]) {
     
                QTCaptureDevice *audioDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeSound];
                success = [audioDevice open:&error];
     
                if (!success) {
                    audioDevice = nil;
                    // Handle error
                }
     
                if (audioDevice) {
                    mCaptureAudioDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:audioDevice];
     
                    success = [mCaptureSession addInput:mCaptureAudioDeviceInput error:&error];
                    if (!success) {
                        // Handle error
                    }
                }
            }
     
    // Create the movie file output and add it to the session
     
            mCaptureMovieFileOutput = [[QTCaptureMovieFileOutput alloc] init];
            success = [mCaptureSession addOutput:mCaptureMovieFileOutput error:&error];
            if (!success) {
                // Handle error
            }
     
            [mCaptureMovieFileOutput setDelegate:self];
     
    // Associate the capture view in the UI with the session
     
            [mCaptureView setCaptureSession:mCaptureSession];
     
            [mCaptureSession startRunning];
        }
     
    }
     
    // Handle window closing notifications for your device input
     
    - (void)windowWillClose:(NSNotification *)notification
    {
     
        [mCaptureSession stopRunning];
     
        if ([[mCaptureVideoDeviceInput device] isOpen])
            [[mCaptureVideoDeviceInput device] close];
     
        if ([[mCaptureAudioDeviceInput device] isOpen])
            [[mCaptureAudioDeviceInput device] close];
     
    }
     
    // Handle deallocation of memory for your capture objects
     
    - (void)dealloc
    {
        [mCaptureSession release];
        [mCaptureVideoDeviceInput release];
        [mCaptureAudioDeviceInput release];
        [mCaptureMovieFileOutput release];
     
        [super dealloc];
    }
  4. Add these start and stop recording actions, and specify the output destination for your recorded media. The output is a QuickTime movie.

     
    - (IBAction)startRecording:(id)sender
    {
        [mCaptureMovieFileOutput recordToOutputFileURL:[NSURL fileURLWithPath:@"/Users/Shared/My Recorded Movie.mov"]];
    }
     
    - (IBAction)stopRecording:(id)sender
    {
        [mCaptureMovieFileOutput recordToOutputFileURL:nil];
    }
     
    // Do something with your QuickTime movie at the path you've specified: /Users/Shared/My Recorded Movie.mov
     
    - (void)captureOutput:(QTCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL forConnections:(NSArray *)connections dueToError:(NSError *)error
    {
        [[NSWorkspace sharedWorkspace] openURL:outputFileURL];
    }
     
     
    @end
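
The // Handle error placeholders in the steps above are left for you to fill in. One minimal approach, sketched below on the assumption that you simply want to notify the user, is to display the NSError in an alert. The reportError: method name is hypothetical and not part of the MyRecorder sample; add a method like this to the MyRecorderController implementation, before the @end.

    // A minimal sketch of one way to surface capture errors to the user.
    // Wherever the sample says "// Handle error", you could call
    // [self reportError:error] instead of leaving the branch empty.
    - (void)reportError:(NSError *)error
    {
        if (error == nil)
            return;
        // Present the error's localized description in a modal alert
        [[NSAlert alertWithError:error] runModal];
    }

The same call works for the error parameter that captureOutput:didFinishRecordingToOutputFileAtURL:forConnections:dueToError: receives if a recording stops because of a problem.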

Now you’re ready to build and run your QTKit capture application. Once you’ve launched the application, you can capture audio along with video from your iSight camera, or audio and video from a DV camera. As before, the output is recorded as a QuickTime movie and then automatically opened in QuickTime Player.

In the next chapter you’ll take on another coding assignment, this time creating a QTKit capture application that lets you grab single frames from a video stream and output them, with great accuracy and reliability (avoiding tearing, for example), into a QuickTime movie. You’ll work with a technique that is common in the movie and TV industries: stop-motion (still-motion) animation.





© 2007 Apple Inc. All Rights Reserved. (Last updated: 2007-10-31)

