iChat Theater allows applications to send additional audio and video tracks during an AV chat. You can use any NSView
as a prebuilt video source or provide the auxiliary video through periodic callbacks for individual frames. Audio is provided through an audio device and channels.
Before implementing a video source, you should select the buffer type—a pixel buffer or an OpenGL buffer—that is most efficiently filled by your application during a callback. The pixel buffer is filled in the main memory—by the CPU rather than the GPU. If you are rendering content using OpenGL, then you typically use the OpenGL buffer.
There are several steps involved in using iChat Theater in your application:
Set the video data source and any video options.
If you are not using an NSView
object as the video source, implement the callbacks that provide individual video frames.
If you're using pixel buffers, implement the pixel buffer methods using Core Video. If you're using OpenGL, implement the OpenGL methods.
Create audio channels and manage them using Core Audio.
Use the start and stop methods to control the video playback.
Register for state change notifications.
You must register for notifications to establish a connection to iChat Theater.
The rest of this article explains how to do each of these steps. Read the “Using Views as Video Data Sources” article for details on how to use an NSView
object as the video data source. Read the “Using Pixel Buffers” and “Using OpenGL Buffers” articles for details on implementing the IMVideoDataSource
protocol.
Getting the Manager
Setting the Video Data Source
Setting Video Options
Implementing the Video Data Source
Creating Audio Channels
Controlling Video Playback
Registering for the State Change Notification
The first step in using iChat Theater is to get the shared manager object that controls auxiliary audio and video playback. The sharedAVManager
class method returns the shared IMAVManager
object. This code fragment gets the state of the shared IMAVManager
object:
IMAVManagerState state = [[IMAVManager sharedAVManager] state];
See IMAVManager Class Reference for descriptions of the different states returned by the state
method.
Your application provides the auxiliary video content that is sent over iChat Theater. This is accomplished using a delegation model. You set a video data source object that conforms to a defined protocol and the Instant Message framework sends a message to the data source object when it needs the next video frame. Hence, messages are sent periodically to your video data source object during playback.
For example, this code fragment sets the video data source for the shared IMAVManager
object using the setVideoDataSource:
method, then sets some optimization options using the setVideoOptimizationOptions:
method, and starts the video playback using the start
method:
Listing 1 Setting the video data source
IMAVManager *avManager = [IMAVManager sharedAVManager];
[avManager setVideoDataSource:videoDataSource];
[avManager setVideoOptimizationOptions:IMVideoOptimizationStills];
[avManager start];
Use the setVideoOptimizationOptions:
method to give hints to the IMAVManager
object so it can optimize the video playback based on the type of video source.
For example, use the IMVideoOptimizationStills
option if you are sharing a slideshow as shown in Listing 1. This option is a hint to iChat Theater that the video doesn’t change for long periods of time. Consequently, iChat Theater assumes the video does not require much bandwidth to encode and send. However, if the video is full-motion, then setting this option has a negative impact on performance.
Use the IMVideoOptimizationReplacement
option if you want to force iChat Theater to replace the outgoing local user’s video with your video data source instead of displaying both video sources side-by-side. If you set this option, iChat can devote full CPU and bandwidth resources to the iChat Theater video. However, if you do not set this option, there’s no guarantee that side-by-side video is used. iChat may replace the local video under certain circumstances—for example, it may replace the video if video chatting with a buddy on Mac OS X v10.4 and earlier, with multiple buddies, or over a slow connection.
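The optimization options form a bit mask, so the hints can be combined. A minimal sketch, assuming a slideshow-style data source object named slideshowDataSource (a hypothetical name), that requests both hints before starting playback:

```objc
#import <InstantMessage/IMAVManager.h>

IMAVManager *avManager = [IMAVManager sharedAVManager];
[avManager setVideoDataSource:slideshowDataSource]; // your data source object

// Combine hints: mostly-still content, and replace the local video entirely.
[avManager setVideoOptimizationOptions:
    (IMVideoOptimizationStills | IMVideoOptimizationReplacement)];
[avManager start];
```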
Your video data source needs to conform to the IMVideoDataSource
informal protocol. You should select the type of buffer that is most efficient for your application.
If you’re using pixel buffers, then implement the getPixelBufferPixelFormat:
and renderIntoPixelBuffer:forTime:
methods. Read “Using Pixel Buffers” for tips on how to implement these methods.
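A sketch of a pixel-buffer implementation might look like the following. The drawFrameInContext: method and the _frameDidChange flag are hypothetical names standing in for your own drawing code and change tracking:

```objc
- (void)getPixelBufferPixelFormat:(OSType *)pixelFormatOut {
    // Tell iChat Theater which pixel format the render callback fills.
    *pixelFormatOut = kCVPixelFormatType_32ARGB;
}

- (BOOL)renderIntoPixelBuffer:(CVPixelBufferRef)buffer forTime:(CVTimeStamp *)timeStamp {
    // Returning NO tells iChat Theater the previous frame is still valid.
    if (!_frameDidChange)
        return NO;

    CVPixelBufferLockBaseAddress(buffer, 0);

    // Wrap the buffer's memory in a bitmap context and draw into it.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(
        CVPixelBufferGetBaseAddress(buffer),
        CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer),
        8, CVPixelBufferGetBytesPerRow(buffer),
        colorSpace, kCGImageAlphaPremultipliedFirst);
    [self drawFrameInContext:context]; // hypothetical drawing method
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    CVPixelBufferUnlockBaseAddress(buffer, 0);
    return YES;
}
```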
If you’re using OpenGL, then implement the getOpenGLBufferContext:pixelFormat:
and renderIntoOpenGLBuffer:onScreen:forTime:
methods. Read “Using OpenGL Buffers” for tips on how to implement these methods.
For performance reasons, none of these callbacks is invoked on the main thread. If you use OpenGL, which is not thread-safe, to render to both the screen and the buffer, then you need to take extra precautions. Read “Using OpenGL Buffers” to learn more about using OpenGL in a multithreaded application.
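An OpenGL implementation might be sketched as follows, assuming _cglContext and _cglPixelFormat are instance variables holding a context shared with your on-screen view, and drawScene is a hypothetical rendering method. Because the render callback arrives on a background thread, access to the shared context is serialized here with @synchronized:

```objc
- (void)getOpenGLBufferContext:(CGLContextObj *)contextOut
                   pixelFormat:(CGLPixelFormatObj *)pixelFormatOut {
    *contextOut = _cglContext;       // context shared with the on-screen view
    *pixelFormatOut = _cglPixelFormat;
}

- (BOOL)renderIntoOpenGLBuffer:(CVOpenGLBufferRef)buffer
                      onScreen:(int *)screenInOut
                       forTime:(CVTimeStamp *)timeStamp {
    @synchronized (self) {
        // Attach the buffer to the context, then draw the frame into it.
        if (CVOpenGLBufferAttach(buffer, _cglContext, 0, 0, *screenInOut)
                == kCVReturnSuccess) {
            CGLSetCurrentContext(_cglContext);
            [self drawScene];        // hypothetical rendering method
            glFlush();
            return YES;
        }
    }
    return NO;
}
```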
The audio tracks are not handled the same way as the video tracks. You set the number of audio channels before playing any AV using the setNumberOfAudioChannels:
method. Currently, the audio can either be mono or stereo. You access the audio device and channels using the audioDeviceUID
and audioDeviceChannels
methods respectively. Use these methods when the shared IMAVManager
is in the IMAVRunning
state; otherwise, they return nil
.
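For example, a sketch that requests stereo audio before starting playback, then reads the device information once the manager is running:

```objc
IMAVManager *avManager = [IMAVManager sharedAVManager];
[avManager setNumberOfAudioChannels:2]; // mono (1) or stereo (2)
[avManager start];

// Later, after the manager reaches the IMAVRunning state:
if ([avManager state] == IMAVRunning) {
    NSString *deviceUID = [avManager audioDeviceUID];
    NSArray *channels = [avManager audioDeviceChannels];
    NSLog(@"Send audio to device %@ on channels %@", deviceUID, channels);
}
```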
Use Core Audio to manage the channels and create audio content. For example, use the AudioHardwareGetProperty
function in Core Audio by passing kAudioHardwarePropertyDeviceForUID
and the value returned by audioDeviceUID
to obtain the device. Read Core Audio Overview to get started with audio and Core Audio Framework Reference for details on Core Audio.
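The UID-to-device translation described above can be sketched with an AudioValueTranslation structure; this assumes the manager is already in the IMAVRunning state so that audioDeviceUID returns a value:

```objc
#import <CoreAudio/CoreAudio.h>
#import <InstantMessage/IMAVManager.h>

// Translate the device UID string into an AudioDeviceID.
CFStringRef uid = (CFStringRef)[[IMAVManager sharedAVManager] audioDeviceUID];
AudioDeviceID deviceID = kAudioDeviceUnknown;
AudioValueTranslation translation = { &uid, sizeof(uid),
                                      &deviceID, sizeof(deviceID) };
UInt32 size = sizeof(translation);
OSStatus err = AudioHardwareGetProperty(kAudioHardwarePropertyDeviceForUID,
                                        &size, &translation);
if (err == noErr && deviceID != kAudioDeviceUnknown) {
    // Use deviceID with the AudioDevice... functions to produce output.
}
```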
You can also play any NSSound
over iChat Theater using the setPlaybackDeviceIdentifier:
and setChannelMapping:
methods of NSSound
. Listing 2 shows how to use these methods. See NSSound Class Reference for details on the setPlaybackDeviceIdentifier:
and setChannelMapping:
methods.
The playMonoForiChat:
method in Listing 2 is intended to be a category method that you add to NSSound
. If the sound has one channel, use the playMonoForiChat:
method instead of the play
method of NSSound
to play the sound over iChat Theater. The sample code includes a similar category method, playStereoForiChat:, for stereo sounds.
Listing 2 Playing sounds over iChat Theater
- (BOOL)playMonoForiChat:(BOOL)flag {
    if (flag) {
        // Set the audio output device to the iChat Theater device.
        IMAVManager *avManager = [IMAVManager sharedAVManager];
        [self setPlaybackDeviceIdentifier:[avManager audioDeviceUID]];

        // Get the channel info for iChat Theater.
        NSArray *channels = [avManager audioDeviceChannels];
        NSUInteger channelCount = [channels count];

        // For a mono sound, map its single channel to all of the IMAVManager's channels.
        NSArray *mapping = (channelCount > 0) ? [NSArray arrayWithObject:channels] : nil;
        [self setChannelMapping:mapping];
    } else {
        // Use the default playback device and channel mapping.
        [self setPlaybackDeviceIdentifier:nil];
        [self setChannelMapping:nil];
    }
    return [self play];
}
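The stereo counterpart might be sketched as follows. The per-channel mapping layout (one array of device channels per sound channel) is an assumption extrapolated from the mono example above, where the single sound channel is mapped to the whole device channel array:

```objc
- (BOOL)playStereoForiChat:(BOOL)flag {
    if (flag) {
        IMAVManager *avManager = [IMAVManager sharedAVManager];
        [self setPlaybackDeviceIdentifier:[avManager audioDeviceUID]];

        NSArray *channels = [avManager audioDeviceChannels];
        NSArray *mapping = nil;
        // Map the left and right sound channels to the first two device channels.
        if ([channels count] >= 2) {
            mapping = [NSArray arrayWithObjects:
                [NSArray arrayWithObject:[channels objectAtIndex:0]],
                [NSArray arrayWithObject:[channels objectAtIndex:1]],
                nil];
        }
        [self setChannelMapping:mapping];
    } else {
        // Use the default playback device and channel mapping.
        [self setPlaybackDeviceIdentifier:nil];
        [self setChannelMapping:nil];
    }
    return [self play];
}
```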
After you set the video data source and create your audio channels, you are ready to start playing AV content in iChat. You simply send start
to the shared IMAVManager
object to play, and stop
to stop the AV content. The IMAVManager
object transitions through several states during playback.
When you send start
to a stopped IMAVManager
object, it changes state from IMAVRequested
to IMAVStartingUp
, then to IMAVPending
, and finally to IMAVRunning
. When you invoke the start
method, the state changes immediately to IMAVStartingUp
and the method returns. The IMAVManager
object asynchronously transitions to the other states.
Conversely, when you send stop
to a running IMAVManager
object, it changes state from IMAVRunning
to IMAVShuttingDown
, and then to IMAVRequested
. When you invoke the stop
method, the state changes immediately to IMAVShuttingDown
and the method returns. The IMAVManager
object asynchronously transitions to IMAVRequested
. The stop
method returns immediately if the IMAVManager
object is not in the IMAVRunning
state.
When using the iChat Theater API, the IMAVManager
object can be in a number of different states at any time—for example, depending on whether or not you invoke the start
or stop
method. Even after invoking these methods, the state of the IMAVManager
object is not guaranteed because errors can occur while transitioning from a stopped to a running state or another application using the iChat Theater API can cause state transitions you might not expect. Invoking other methods while IMAVManager
is not in an expected state can raise exceptions or do nothing.
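One defensive pattern, sketched below, is to check the state before acting on the manager rather than assuming a particular transition has completed:

```objc
IMAVManager *avManager = [IMAVManager sharedAVManager];
switch ([avManager state]) {
    case IMAVRequested:
        [avManager start]; // safe to start from the stopped, requested state
        break;
    case IMAVRunning:
        [avManager stop];
        break;
    default:
        // Mid-transition or inactive; wait for a state change notification.
        break;
}
```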
Typically, you register for the IMAVManagerStateChangedNotification
notification to be notified when the shared IMAVManager
object changes state and then use the state
method to get the new state. You should register for this notification early in your application, before sending state
to the shared IMAVManager
object, because registering for this notification establishes a connection to iChat Theater. Otherwise, state values returned by IMAVManager
may not be accurate.
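For example, a sketch of the registration and a handler (stateChanged: is a hypothetical handler name):

```objc
// Register early, before querying state, so that the iChat Theater
// connection is established.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(stateChanged:)
                                             name:IMAVManagerStateChangedNotification
                                           object:nil];

// Hypothetical handler invoked whenever the shared manager changes state.
- (void)stateChanged:(NSNotification *)notification {
    IMAVManagerState state = [[IMAVManager sharedAVManager] state];
    // Update the UI or start/stop playback based on the new state.
}
```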
© 2007 Apple Inc. All Rights Reserved. (Last updated: 2007-10-31)