The Mac OS X and iPhone OS platforms were built to provide a rich multimedia experience. To support that experience, both platforms provide plenty of support for loading and using image, sound, and video resources in your application. Image resources are commonly used to draw portions of an application’s user interface. Sound and video resources are used less frequently but can also enhance the basic appearance and appeal of an application. The following sections describe the support available for working with image, sound, and video resources in your applications.
Images and Sounds in Nib Files
Loading Image Resources
Playing Audio Files
Playing Video Resources
Using Interface Builder, you can reference your application’s sound and image files from within nib files. You might do so to associate those images or sounds with different properties of a view or control. For example, you might set the default image to display in an image view or set the image to display for a button. Creating such a connection in Interface Builder saves you the hassle of having to make that connection later when the nib file is loaded.
To make image and sound resources available in Interface Builder 3.0 and later, all you have to do is add them to your Xcode project. Interface Builder automatically searches your Xcode project for resources and lists them in the library window. When you make a connection to a given resource file, Interface Builder makes a note of that connection in the nib file. At load time, the nib-loading code looks for that resource in the project bundle, where it should have been placed by Xcode at build time.
When you load a nib file that contains references to image and sound resources, the nib-loading code caches those resources whenever possible for easy retrieval later. For example, after loading a nib file, you can retrieve an image associated with that nib file using the imageNamed: method of either NSImage or UIImage (depending on your platform). Similarly, you can retrieve cached sound resources in Mac OS X using the soundNamed: method of NSSound. Interface Builder does not cache sound resources in iPhone OS, nor does it cache image and sound resources associated with Carbon nib files.
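For example, assuming your nib file references an image file named Toolbar.png and a sound file named Chime.aiff (hypothetical names used here for illustration), you could retrieve the cached versions in a Mac OS X application as follows:

```objc
// Retrieve resources that the nib-loading code cached at load time.
// ("Toolbar" and "Chime" are hypothetical resource names.)
NSImage* toolbarImage = [NSImage imageNamed:@"Toolbar"];
NSSound* chimeSound = [NSSound soundNamed:@"Chime"];
```

In an iPhone OS application, you would use the corresponding imageNamed: method of UIImage instead.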
Image resources are commonly used in most applications. Even very simple applications use images to create a custom look for controls and views. Mac OS X and iPhone OS provide extensive support for manipulating image data using Objective-C objects. These objects make using images extremely easy, often requiring only a few lines of code to load and draw an image. If you prefer not to use the Objective-C objects, you can also use Quartz to load images using a C-based interface. The following sections describe the process for loading image resource files using each of the available techniques.
To load images in Objective-C, you use either the NSImage or UIImage class, depending on the current platform. Applications built for Mac OS X using the AppKit framework use the NSImage class to load and draw images. Applications built for iPhone OS use the UIImage class. Functionally, both of these classes provide almost identical behavior when it comes to loading existing image resources. You initialize the object by passing it the path to the image file in your application bundle, and the image object takes care of the details of loading and drawing the image data.
Listing 4-1 shows how to load an image resource using the NSImage class. After you locate the image resource, which in this case is in the application bundle, you simply use that path to initialize the image object. After initialization, you can draw the image using the methods of NSImage or pass the object to other methods that can use it. To perform the same task in iPhone OS, all you would need to do is change references of NSImage to UIImage.
Listing 4-1 Loading an image resource
NSString* imageName = [[NSBundle mainBundle] pathForResource:@"image1" ofType:@"png"];
NSImage* imageObj = [[NSImage alloc] initWithContentsOfFile:imageName];
You can use image objects to open any type of image supported on the target platform. Each object is typically a lightweight wrapper for more advanced image-handling code. To draw an image in the current graphics context, you simply use one of its drawing-related methods. Both NSImage and UIImage have methods for drawing the image in several different ways. The NSImage class also provides extra support for manipulating the images you load.
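For example, in an iPhone OS application you might draw a previously loaded image from a view's drawRect: method, as in the following sketch (backgroundImage is a hypothetical UIImage instance variable of the view):

```objc
- (void)drawRect:(CGRect)rect
{
    // Draw the image at the view's origin at its natural size...
    [backgroundImage drawAtPoint:CGPointZero];

    // ...or uncomment the next line to scale it to fill the view instead.
    // [backgroundImage drawInRect:self.bounds];
}
```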
For information about the methods of the NSImage and UIImage classes, see NSImage Class Reference and UIImage Class Reference. For more detailed information about the additional features of the NSImage class, see Images in Cocoa Drawing Guide.
If you are writing C-based code, you can use a combination of Core Foundation and Quartz calls to load image resources into your applications. Core Foundation provides the initial support for locating image resources and loading the corresponding image data into memory. Quartz takes the image data you load into memory and turns it into a usable CGImageRef that your code can then use to draw the image.
There are two ways to load images using Quartz: data providers and image source objects. Data providers are available in both iPhone OS and Mac OS X. Image source objects are available only in Mac OS X v10.4 and later but take advantage of the Image I/O framework to enhance the basic image handling capabilities of data providers. When it comes to loading and displaying image resources, both technologies are well suited for the job. The only time you might prefer image sources over data providers is when you want greater access to the image-related data.
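For comparison with the data provider approach in Listing 4-2, a minimal sketch of the image source approach on Mac OS X v10.4 and later might look like the following (error handling omitted; the function name is hypothetical):

```c
#include <ApplicationServices/ApplicationServices.h>

CGImageRef MyCreateImageWithImageSource (CFURLRef url)
{
    // Create an image source for the file; Image I/O determines the format.
    CGImageSourceRef source = CGImageSourceCreateWithURL (url, NULL);

    // Create a CGImageRef from the first image in the source.
    CGImageRef image = CGImageSourceCreateImageAtIndex (source, 0, NULL);

    CFRelease (source);
    return (image);
}
```

Unlike a JPEG-specific data provider call, the image source infers the file format for you and can also report image properties and thumbnails.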
Listing 4-2 shows how to use a data provider to load a JPEG image. The function uses the Core Foundation bundle support to locate the image in the application's main bundle and get a URL to it. It then uses that URL to create the data provider object and then create a CGImageRef for the corresponding JPEG data. (For brevity, this example omits any error-handling code. Your own code should make sure that any referenced data structures are valid.)
Listing 4-2 Using data providers to load image resources
CGImageRef MyCreateJPEGImageRef (const char *imageName)
{
    CGImageRef image;
    CGDataProviderRef provider;
    CFStringRef name;
    CFURLRef url;
    CFBundleRef mainBundle = CFBundleGetMainBundle();

    // Get the URL to the bundle resource.
    name = CFStringCreateWithCString (NULL, imageName, kCFStringEncodingUTF8);
    url = CFBundleCopyResourceURL (mainBundle, name, CFSTR("jpg"), NULL);
    CFRelease (name);

    // Create the data provider object.
    provider = CGDataProviderCreateWithURL (url);
    CFRelease (url);

    // Create the image object from that provider.
    image = CGImageCreateWithJPEGDataProvider (provider, NULL, true,
                                               kCGRenderingIntentDefault);
    CGDataProviderRelease (provider);

    return (image);
}
For detailed information about working with Quartz images, see Quartz 2D Programming Guide. For reference information about data providers, see Quartz 2D Reference Collection (Mac OS X) or Core Graphics Framework Reference (iPhone OS).
Audio resources are typically used to provide audio feedback for different parts of your application. Several technologies are available to handle the loading and playback of audio. Which technology you use depends on the underlying platform and the level of sophistication you need for handling the audio. The following sections describe the key technologies you might use and when you would use them.
Both Mac OS X and iPhone OS support the playback of audio files using the Core Audio family of frameworks. Core Audio provides a wide range of audio services, including the playback of essentially any kind of audio file you can imagine. For basic playback, Core Audio offers two mechanisms, both available in the Audio Toolbox framework:
To play short sound files (fewer than five seconds in duration) when you do not need level control or other playback control, use System Sound Services.
To play longer sound files, to exert control over playback including level adjustments, or to play multiple sounds simultaneously, use Audio Queue Services.
Listing 4-3 shows a short program that uses the interfaces in System Sound Services to play a sound. Before playing the sound, it registers the sound and creates a sound ID for it. To play the sound, it then passes this sound ID to the AudioServicesPlaySystemSound function. When the sound finishes playing, Core Audio notifies the application by calling its sound completion callback routine. This routine cleans up the sound ID before the program exits.
Listing 4-3 Playing a sound using System Sound Services
#include <AudioToolbox/AudioToolbox.h>
#include <CoreFoundation/CoreFoundation.h>

// Define a callback to be called when the sound is finished
// playing. Useful when you need to free memory after playing.
static void MyCompletionCallback (
    SystemSoundID mySSID,
    void * myURLRef
) {
    AudioServicesDisposeSystemSoundID (mySSID);
    CFRelease (myURLRef);
    CFRunLoopStop (CFRunLoopGetCurrent());
}

int main (int argc, const char * argv[]) {
    // Set up the pieces needed to play a sound.
    SystemSoundID mySSID;
    CFURLRef myURLRef;
    myURLRef = CFURLCreateWithFileSystemPath (
        kCFAllocatorDefault,
        CFSTR ("../../ComedyHorns.aif"),
        kCFURLPOSIXPathStyle,
        FALSE
    );

    // Create a system sound ID to represent the sound file.
    OSStatus error = AudioServicesCreateSystemSoundID (myURLRef, &mySSID);

    // Register the sound completion callback.
    // Again, useful when you need to free memory after playing.
    AudioServicesAddSystemSoundCompletion (
        mySSID,
        NULL,
        NULL,
        MyCompletionCallback,
        (void *) myURLRef
    );

    // Play the sound file.
    AudioServicesPlaySystemSound (mySSID);

    // Invoke a run loop on the current thread to keep the application
    // running long enough for the sound to play; the sound completion
    // callback later stops this run loop.
    CFRunLoopRun ();
    return 0;
}
For more information about the features of Core Audio, see Core Audio Overview. For information and examples of how to play sounds using the Audio Queue Services technology, see Audio Queue Services Programming Guide.
In Mac OS X, the AppKit framework provides support for loading and playing sound files through the NSSound class. You can use this class to play back sounds stored in AIFF, WAV, and NeXT .snd files. For sound resources located in your application's bundle, the simplest way to load a sound is using the soundNamed: method, as shown in the following example:
NSSound* aSound = [NSSound soundNamed:@"mySound"];
The soundNamed: method checks the application's sound cache for an existing sound resource with the specified name. If the specified resource is not currently in the sound cache, NSSound automatically searches for it in several other locations, including your application's main bundle and any system Library/Sounds directories.
Because the soundNamed: method also loads system sounds, you should avoid using the names of system sounds when naming any of your custom sound files. Cocoa populates the sound cache with any sound files it needs, such as the file used for the current system alert sound. It caches these sounds under the filename of the sound (minus its filename extension). If one of your custom sounds matches the name of a different sound file that is already cached, the soundNamed: method returns the cached file instead of your custom one.
If you want to ensure that the correct sound file resource is loaded every time, you can always load the sound file using an explicit path string, as shown in the following example.
NSString* soundFile = [[NSBundle mainBundle] pathForResource:@"mySound" ofType:@"aiff"];
NSSound* sound = [[NSSound alloc] initWithContentsOfFile:soundFile byReference:YES];
Note: Sound files associated with a nib file are loaded automatically when the nib file is loaded. To access those sounds, use the soundNamed: method of NSSound, passing in the name of the sound. For more information, see “About Image and Sound Resources.”
For more information about using the NSSound class, see Sound Programming Topics for Cocoa and NSSound Class Reference. If you want to load sound resources for Carbon-based applications, you must use QuickTime or Core Audio to do so. For information about the QuickTime Kit framework, see QuickTime Kit Programming Guide and QTKit Framework Reference. For general information about the QuickTime framework, see QuickTime Overview.
Video resources are prerendered movie files that you can play from your application’s user interface. Games often use prerendered movies as cut scenes between different levels. The following sections provide information about how to load these types of resources and play them in your applications.
Video files are like any other resource files in your application. Once you locate the resource file, you can use an appropriate technology to open and play it. In Mac OS X, you use the QuickTime or QuickTime Kit frameworks to open video files, associate them with a graphics context, and play their contents. These frameworks support the playback of both video and audio files in either C or Objective-C code.
The following example loads a video file from an application's bundle and associates it with a view using the QuickTime Kit framework. The view object returned by the getMyQTMovieView method is assumed to be a QTMovieView object located in one of the caller's windows.
NSString* movieFile = [[NSBundle mainBundle] pathForResource:@"myMovie" ofType:@"mov"];
QTMovie* aMovie = [QTMovie movieWithFile:movieFile error:nil];

// Install the movie in a custom movie view associated with the caller.
QTMovieView* myView = [self getMyQTMovieView];
[myView setMovie:aMovie];
Prior to Mac OS X v10.4, you can use the NSMovie and NSMovieView classes in Cocoa to load and display video files. In Mac OS X v10.4 and later, it is recommended that you use the classes of the QuickTime Kit framework instead.
For C-based applications, you can load video files using either the QuickTime framework or the QuickTime Kit framework. If you choose to use the QuickTime Kit framework, you must incorporate Objective-C code into your project. For information on how to use Objective-C code in Carbon applications, see Carbon-Cocoa Integration Guide.
For information about the QuickTime Kit framework, see QuickTime Kit Programming Guide and QTKit Framework Reference. For general information about the QuickTime framework, see QuickTime Overview. For details of how to incorporate movie content into your application, see QuickTime Movie Basics.
iPhone OS supports playing video files directly from your application using the Media Player framework (MediaPlayer.framework). Video playback is supported in full-screen mode only and can be used by game developers who want to play cut-scene animations or by other developers who want to play media files. When you start a video from your application, the media player interface takes over, fading the screen to black and then fading in the video content. You can play a video with or without transport controls; enabling transport controls lets the user pause or adjust the playback of the video. If you do not enable these controls, the video plays until completion or until you explicitly stop it in your code.
To initiate video playback, you must know the URL of the file you want to play. For files your application provides, this would typically be a pointer to a file in your application's bundle; however, it can also be a pointer to a file on a remote server or elsewhere in your application's home directory. You use this URL to instantiate a new instance of the MPMoviePlayerController class. This class presides over the playback of your video file and manages user interactions, such as user taps in the transport controls (if shown). To initiate playback, simply call the play method of the controller.
Listing 4-4 shows a sample method that plays back the video at the specified URL. The play method is an asynchronous call that returns control to the caller while the movie plays. The movie controller loads the movie in a full-screen view and animates it into place on top of the application's existing content. When playback is finished, the movie controller posts a notification; the method registered for that notification releases the movie controller now that it is no longer needed.
Listing 4-4 Playing full-screen movies
-(void)playMovieAtURL:(NSURL*)theURL
{
    MPMoviePlayerController* thePlayer =
            [[MPMoviePlayerController alloc] initWithContentURL:theURL];
    thePlayer.scalingMode = MPMovieScalingModeAspectFill;
    thePlayer.movieControlMode = MPMovieControlModeHidden;

    // Register for the playback finished notification.
    [[NSNotificationCenter defaultCenter] addObserver:self
            selector:@selector(myMovieFinishedCallback:)
            name:MPMoviePlayerPlaybackDidFinishNotification
            object:thePlayer];

    // Movie playback is asynchronous, so this method returns immediately.
    [thePlayer play];
}

// When the movie is done, release the controller.
-(void)myMovieFinishedCallback:(NSNotification*)aNotification
{
    MPMoviePlayerController* thePlayer = [aNotification object];
    [[NSNotificationCenter defaultCenter] removeObserver:self
            name:MPMoviePlayerPlaybackDidFinishNotification
            object:thePlayer];

    // Release the movie instance created in playMovieAtURL:
    [thePlayer release];
}
© 2009 Apple Inc. All Rights Reserved. (Last updated: 2009-01-06)