The graphics and multimedia capabilities of Mac OS X set it apart from other operating systems. Mac OS X is built on a modern foundation that supports advanced compositing operations and hardware-accelerated rendering on capable graphics hardware. On top of this core are an array of technologies that provide support for drawing 2D, 3D, and video-based content. The system also provides an advanced audio system for the generation, playback, and manipulation of multichannel audio.
Drawing Technologies
Text and Fonts
Audio Technologies
Video Technologies
Color Management
Printing
Accelerating Your Multimedia Operations
Mac OS X includes numerous technologies for rendering 2D and 3D content and for animating that content dynamically at runtime.
Quartz is at the heart of the Mac OS X graphics and windowing environment. Quartz provides rendering support for 2D content and combines a rich imaging model with on-the-fly rendering, compositing, and anti-aliasing of content. It also implements the windowing system for Mac OS X and provides low-level services such as event routing and cursor management.
Quartz comprises both a client API (Quartz 2D) and a window server (Quartz Compositor). The client API provides commands for managing the graphics context and for drawing primitive shapes, images, text, and other content. The window server manages the display and device driver environment and provides essential services to clients, including basic window management, event routing, and cursor management behaviors.
The Quartz 2D client API is implemented as part of the Application Services umbrella framework (ApplicationServices.framework), which is what you include in your projects when you want to use Quartz. This umbrella framework includes the Core Graphics framework (CoreGraphics.framework), which defines the Quartz 2D interfaces, types, and constants you use in your applications.
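For illustration, the following sketch shows a few basic Quartz 2D calls made from the drawRect: method of a hypothetical Cocoa view subclass (MyQuartzView is not a system class). It is a minimal example, not a complete drawing implementation.

    #import <Cocoa/Cocoa.h>
    #include <math.h>

    // A sketch only: a hypothetical NSView subclass that issues basic
    // Quartz 2D (Core Graphics) calls from its drawRect: method.
    @interface MyQuartzView : NSView
    @end

    @implementation MyQuartzView
    - (void)drawRect:(NSRect)dirtyRect
    {
        // Get the Quartz graphics context for the current drawing environment.
        CGContextRef context =
            (CGContextRef)[[NSGraphicsContext currentContext] graphicsPort];

        // Fill a translucent red rectangle; Quartz composites it using its alpha.
        CGContextSetRGBFillColor(context, 1.0, 0.0, 0.0, 0.5);
        CGContextFillRect(context, CGRectMake(20.0, 20.0, 120.0, 80.0));

        // Stroke an anti-aliased circular path.
        CGContextSetRGBStrokeColor(context, 0.0, 0.0, 1.0, 1.0);
        CGContextSetLineWidth(context, 3.0);
        CGContextAddArc(context, 200.0, 60.0, 40.0, 0.0, 2.0 * M_PI, 0);
        CGContextStrokePath(context);
    }
    @end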
The Quartz Services API (which is also part of the Core Graphics framework) provides direct access to some low-level features of the window server. You can use this API to get information about the currently connected display hardware, capture a display for exclusive use, or adjust display attributes, such as its resolution, pixel depth, and refresh rate. Quartz Services also provides some support for operating a Mac OS X system remotely.
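As a brief sketch of the Quartz Services functions, the following example asks the window server for the active displays and prints the size of each. The eight-display limit is an arbitrary choice for the example.

    #include <ApplicationServices/ApplicationServices.h>
    #include <stdio.h>

    // A sketch only: list the active displays and report their bounds.
    static void LogActiveDisplays(void)
    {
        CGDirectDisplayID displays[8];
        CGDisplayCount displayCount = 0;

        if (CGGetActiveDisplayList(8, displays, &displayCount) == kCGErrorSuccess) {
            for (CGDisplayCount i = 0; i < displayCount; i++) {
                CGRect bounds = CGDisplayBounds(displays[i]);
                printf("Display %u: %.0f x %.0f pixels\n", (unsigned)i,
                       bounds.size.width, bounds.size.height);
            }
        }
    }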
For information about the Quartz 2D API, see Quartz 2D Programming Guide. For information about the Quartz Services API, see Quartz Display Services Programming Topics.
The Quartz imaging architecture is based on a digital paper metaphor. In this case, the digital paper is PDF, which is also the internal model used by Quartz to store rendered content. Content stored in this medium has a very high fidelity and can be reproduced on many different types of devices, including displays, printers, and fax machines. This content can also be written to a PDF file and viewed by any number of applications that display the PDF format.
The PDF model gives application developers much more control over the final appearance of their content. PDF takes into account the application’s choice of color space, fonts, image compression, and resolution. Vector artwork can be scaled and manipulated during rendering to implement unique effects, such as those that occur when the system transitions between users with the fast user switching feature.
Mac OS X also takes advantage of the flexibility of PDF in implementing some system features. For example, in addition to printing, the standard printing dialogs offer options to save a document as PDF, preview the document before printing, or transmit the document using a fax machine. The PDF used for all of these operations comes from the same source: the pages formatted for printing by the application’s rendering code. The only difference is the device to which that content is sent.
Quartz 2D provides many important features to user applications, including the following:
High-quality rendering on the screen
Internal compression of data
A consistent feature set for all printers
Automatic PDF generation and support for printing, faxing, and saving as PDF
Color management through ColorSync
Table 3-1 describes some of the technical specifications for Quartz.
Bit depth | A minimum bit depth of 16 bits for typical users. An 8-bit depth in full-screen mode is available for Classic applications, games, and other multimedia applications. |
Minimum resolution | Supports 800 pixels by 600 pixels as the minimum screen resolution for typical users. A resolution of 640 x 480 is available for the iBook as well as for Classic applications, games, and other multimedia applications. |
Vector processing | Quartz takes advantage of any available vector unit hardware to boost performance. |
Quartz Extreme | Quartz Extreme uses OpenGL to draw the entire Mac OS X desktop. Graphics calls render in supported video hardware, freeing up the CPU for other tasks. |
Quartz Compositor, the window server for Mac OS X, coordinates all of the low-level windowing behavior and enforces a fundamental uniformity in what appears on the screen. It manages the displays available on the user’s system, interacting with the necessary device drivers. It also provides window management, event-routing, and cursor management behaviors.
In addition to window management, Quartz Compositor handles the compositing of all visible content on the user’s desktop. It supports transparency effects through the use of alpha channel information, which makes it possible to display drop shadows, cutouts, and other effects that add a more realistic and dimensional texture to the windows.
The performance of Quartz Compositor remains consistently high because of several factors. To improve window redrawing performance, Quartz Compositor supports buffered windows and the layered compositing of windows and window content. Thus, windows that are hidden behind opaque content are never composited. Quartz Compositor also incorporates Quartz Extreme, which speeds up rendering calls by handing them off to graphics hardware whenever possible.
Figure 3-1 shows the high-level relationships between Quartz Compositor and the rendering technologies available on Mac OS X. QuickTime and OpenGL have fewer dependencies on Quartz Compositor because they implement their own versions of certain windowing capabilities.
The Cocoa application environment provides object-oriented wrappers for many of the features found in Quartz. Cocoa provides support for drawing primitive shapes such as lines, rectangles, ovals, arcs, and Bezier paths. It supports drawing in both standard and custom color spaces and it supports content manipulations using graphics transforms. Because it is built on top of Quartz, drawing calls made from Cocoa are composited along with all other Quartz 2D content. You can even mix Quartz drawing calls (and drawing calls from other system graphics technologies) with Cocoa calls in your code if you wish.
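For example, a hypothetical NSView subclass (OvalView is not a system class) might draw an oval using NSBezierPath, as in the following sketch; the drawing is composited by Quartz along with all other window content.

    #import <Cocoa/Cocoa.h>

    // A sketch only: basic Cocoa drawing in a custom view.
    @interface OvalView : NSView
    @end

    @implementation OvalView
    - (void)drawRect:(NSRect)dirtyRect
    {
        // Fill the view with a background color.
        [[NSColor whiteColor] set];
        NSRectFill([self bounds]);

        // Stroke an anti-aliased oval inset from the view's bounds.
        NSBezierPath *oval = [NSBezierPath bezierPathWithOvalInRect:
                                 NSInsetRect([self bounds], 10.0, 10.0)];
        [oval setLineWidth:2.0];
        [[NSColor blueColor] set];
        [oval stroke];
    }
    @end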
For more information on how to draw using Cocoa, see Cocoa Drawing Guide.
OpenGL is an industry-wide standard for developing portable three-dimensional (3D) graphics applications. It is specifically designed for games, animation, CAD/CAM, medical imaging, and other applications that need a rich, robust framework for visualizing shapes in two and three dimensions. The OpenGL API is one of the most widely adopted graphics API standards, which makes code written for OpenGL portable and consistent across platforms. The OpenGL framework (OpenGL.framework) in Mac OS X includes a highly optimized implementation of the OpenGL libraries that provides high-quality graphics at a consistently high level of performance.
OpenGL offers a broad and powerful set of imaging functions, including texture mapping, hidden surface removal, alpha blending (transparency), anti-aliasing, pixel operations, viewing and modeling transformations, atmospheric effects (fog, smoke, and haze), and other special effects. Each OpenGL command directs a drawing action or causes a special effect, and developers can create lists of these commands for repetitive effects. Although OpenGL is largely independent of the windowing characteristics of each operating system, the standard defines special glue routines to enable OpenGL to work in an operating system’s windowing environment. The Mac OS X implementation of OpenGL implements these glue routines to enable operation with the Quartz Compositor.
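As a brief illustration, a hypothetical NSOpenGLView subclass might render a single triangle from its drawRect: method, as in the following sketch (a pixel format is assumed to be configured in Interface Builder or in code).

    #import <Cocoa/Cocoa.h>
    #import <OpenGL/gl.h>

    // A sketch only: immediate-mode OpenGL drawing in a custom NSOpenGLView.
    @interface TriangleView : NSOpenGLView
    @end

    @implementation TriangleView
    - (void)drawRect:(NSRect)dirtyRect
    {
        [[self openGLContext] makeCurrentContext];

        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        // Draw one colored triangle in normalized device coordinates.
        glBegin(GL_TRIANGLES);
        glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
        glEnd();

        // Present the rendered content to the window.
        [[self openGLContext] flushBuffer];
    }
    @end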
In Mac OS X v10.5 and later, OpenGL supports the ability to use multiple threads to process graphics data. OpenGL also supports pixel buffer objects, color-managed texture images in the sRGB color space, 64-bit addressing, and improvements in the shader programming API. You can also attach an AGL context to WindowRef and HIView objects and thereby avoid using QuickDraw ports.
For information about using OpenGL in Mac OS X, see OpenGL Programming Guide for Mac OS X.
Introduced in Mac OS X v10.5, Core Animation is a set of Objective-C classes for doing sophisticated 2D rendering and animation. Using Core Animation, you can create everything from basic window content to Front Row–style user interfaces, and achieve respectable animation performance, without having to tune your code using OpenGL or other low-level drawing routines. This performance is achieved using server-side content caching, which restricts the compositing operations performed by the server to only those parts of a view or window whose contents actually changed.
At the heart of the Core Animation programming model are layer objects, which are similar in many ways to Cocoa views. Like views, you can arrange layers in hierarchies, change their size and position, and tell them to draw themselves. Unlike views, layers do not support event-handling, accessibility, or drag and drop. You can also manipulate the layout of layers in more ways than traditional Cocoa views. In addition to positioning layers using a layout manager, you can apply 3D transforms to layers to rotate, scale, skew, or translate them in relation to their parent layer.
Layer content can be animated implicitly or explicitly depending on the actions you take. Modifying specific properties of a layer, such as its geometry, visual attributes, or children, typically triggers an implicit animation to transition from the old state to the new state of the property. For example, adding a child layer triggers an animation that causes the child layer to fade gradually into view. You can also trigger animations explicitly in a layer by modifying its transformation matrix.
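As a minimal sketch of implicit animation, the following example adds a layer to an existing layer-backed view and then changes two of its animatable properties; hostView is a hypothetical name for a view on which setWantsLayer:YES has already been called.

    #import <Cocoa/Cocoa.h>
    #import <QuartzCore/QuartzCore.h>

    // A sketch only: hostView is assumed to be an existing, layer-backed view.
    static void AddAnimatedBadge(NSView *hostView)
    {
        CALayer *badge = [CALayer layer];
        badge.bounds = CGRectMake(0.0, 0.0, 64.0, 64.0);
        badge.position = CGPointMake(50.0, 50.0);
        badge.cornerRadius = 8.0;

        CGColorRef orange = CGColorCreateGenericRGB(1.0, 0.5, 0.0, 1.0);
        badge.backgroundColor = orange;
        CGColorRelease(orange);

        [[hostView layer] addSublayer:badge];

        // Changing animatable properties triggers implicit animations that
        // interpolate from the old values to the new ones.
        badge.position = CGPointMake(200.0, 150.0);
        badge.opacity = 0.5f;
    }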
You can manipulate layers independent of, or in conjunction with, the views and windows of your application. Both Cocoa and Carbon applications can take advantage of Core Animation's integration with the NSView class to add animation effects to windows. Layers can also support the following types of content:
Quartz Composer compositions
OpenGL content
Core Image filter effects
Quartz and Cocoa drawing content
QuickTime playback and capture
The Core Animation features are part of the Quartz Core framework (QuartzCore.framework). For information about Core Animation, see Animation Overview.
Introduced in Mac OS X version 10.4, Core Image extends the basic graphics capabilities of the system to provide a framework for implementing complex visual behaviors in your application. Core Image uses GPU-based acceleration and 32-bit floating-point support to provide fast image processing and pixel-level accurate content. The plug-in based architecture lets you expand the capabilities of Core Image through the creation of image units, which implement the desired visual effects.
Core Image includes built-in image units that allow you to:
Correct color, including performing white-point adjustments
Apply color effects, such as sepia tone
Blur or sharpen images
Composite images
Warp the geometry of an image by applying an affine transform or a displacement effect
Generate color, checkerboard patterns, Gaussian gradients, and other pattern images
Add transition effects to images or video
Provide real-time control, such as color adjustment and support for sports, vivid, and other video modes
Apply linear lighting effects, such as spotlight effects
You define custom image units using the classes of the Core Image framework. You can use both the built-in and custom image units in your application to implement special effects and perform other types of image manipulations. Image units take full advantage of hardware vector units, Quartz, OpenGL, and QuickTime to optimize the processing of video and image data. Rasterization of the data is ultimately handled by OpenGL, which takes advantage of graphics hardware acceleration whenever it is available.
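For illustration, the following sketch applies the built-in sepia-tone image unit to an image file; the file path parameter is hypothetical, and rendering of the returned image happens lazily when it is drawn.

    #import <QuartzCore/QuartzCore.h>   // Core Image is part of Quartz Core

    // A sketch only: run the CISepiaTone filter over an image on disk.
    static CIImage *SepiaVersionOfFile(NSString *path)
    {
        CIImage *input = [CIImage imageWithContentsOfURL:
                             [NSURL fileURLWithPath:path]];

        CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
        [sepia setValue:input forKey:kCIInputImageKey];
        [sepia setValue:[NSNumber numberWithFloat:0.8f]
                 forKey:kCIInputIntensityKey];

        // The output is computed when the image is rendered, not here.
        return [sepia valueForKey:kCIOutputImageKey];
    }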
Core Image is part of the Quartz Core framework (QuartzCore.framework). For information about how to use Core Image or how to write custom image units, see Core Image Programming Guide and Core Image Reference Collection. For information about the built-in filters in Core Image, see Core Image Filter Reference.
Introduced in Mac OS X v10.5, the Image Kit framework is an Objective-C framework that makes it easy to incorporate powerful imaging services into your applications. This framework takes advantage of features in Quartz, Core Image, OpenGL, and Core Animation to provide an advanced and highly optimized development path for implementing the following features:
Displaying images
Rotating, cropping, and performing other image-editing operations
Browsing for images
Taking pictures using the built-in picture taker panel
Displaying slideshows
Browsing for Core Image filters
Displaying custom views for Core Image filters
The Image Kit framework is included as a subframework of the Quartz framework (Quartz.framework). For more information on how to use Image Kit, see Image Kit Programming Guide and Image Kit Reference Collection.
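As a brief, hedged sketch of the image display feature, the following example assumes a hypothetical controller with an IKImageView outlet named imageView, connected in Interface Builder, and loads an image file into it.

    #import <Cocoa/Cocoa.h>
    #import <Quartz/Quartz.h>   // Image Kit is a subframework of Quartz.framework

    // A sketch only: imageView is assumed to be connected in a nib.
    @interface ImageController : NSObject
    {
        IBOutlet IKImageView *imageView;
    }
    - (void)showImageAtPath:(NSString *)path;
    @end

    @implementation ImageController
    - (void)showImageAtPath:(NSString *)path
    {
        [imageView setImageWithURL:[NSURL fileURLWithPath:path]];  // load and display
        [imageView zoomImageToFit:self];                           // scale to fit the view
    }
    @end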
QuickDraw is a legacy technology adapted from earlier versions of the Mac OS that lets you construct, manipulate, and display two-dimensional shapes, pictures, and text. Because it is a legacy technology, QuickDraw should not be used for any active development. Instead, you should use Quartz.
If your code currently uses QuickDraw, you should begin converting it to Quartz 2D as soon as possible. The QuickDraw API includes features to make transitioning your code easier. For example, QuickDraw includes interfaces for getting a Quartz graphics context from a GrafPort structure. You can use these interfaces to transition your QuickDraw code in stages without radically impacting the stability of your builds.
Important: QuickDraw is deprecated in Mac OS X v10.5 and later. QuickDraw is not available for 64-bit applications.
Mac OS X provides extensive support for advanced typography for both Carbon and Cocoa programs. These APIs let you control the fonts, layout, typesetting, text input, and text storage in your programs and are described in the following sections. For guidance on choosing the best technology for your needs, see Getting Started with Text and Fonts.
Cocoa provides advanced text-handling capabilities in the Application Kit framework. Based on Core Text, the Cocoa text system provides a multilayered approach to implementing a full-featured text system using Objective-C. This layered approach lets you customize portions of the system that are relevant to your needs while using the default behavior for the rest of the system. You can use Cocoa Text to display small or large amounts of text and can customize the default layout manager classes to support custom layout.
Although part of Cocoa, the Cocoa text system can also be used in Carbon-based applications. If your Carbon application displays moderate amounts of read-only or editable text, you can use HIView wrappers for the NSString, NSTextField, and NSTextView classes to implement that support. Using wrappers is much easier than trying to implement the same behavior using lower-level APIs, such as Core Text, ATSUI, or MLTE. For more information on using wrapper classes, see Carbon-Cocoa Integration Guide.
For an overview of the Cocoa text system, see Text System Overview.
Introduced in Mac OS X v10.5, Core Text is a C-based API that provides you with precise control over text layout and typography. Core Text provides a layered approach to laying out and displaying Unicode text. You can modify as much or as little of the system as is required to suit your needs. Core Text also provides optimized configurations for common scenarios, saving setup time in your application. Designed for performance, Core Text is up to twice as fast as ATSUI (see “Apple Type Services for Unicode Imaging”), the text-handling technology that it replaces.
The Core Text font API is complementary to the Core Text layout engine. Core Text font technology is designed to handle Unicode fonts natively and comprehensively, unifying disparate Mac OS X font facilities so that developers can do everything they need to do without resorting to other APIs.
Carbon and Cocoa developers who want a high-level text layout API should consider using the Cocoa text system and the supporting Cocoa text views. Unless you need low-level access to the layout manager routines, the Cocoa text system should provide most of the features and performance you need. If you need a lower-level API for drawing any kind of text into a CGContext, then you should consider using the Core Text API.
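As a brief illustration of that lower-level path, the following sketch lays out and draws a single line of text into a Quartz context using Core Text; the font name and string are arbitrary choices for the example.

    #include <ApplicationServices/ApplicationServices.h>

    // A sketch only: draw one line of attributed text with Core Text.
    static void DrawGreeting(CGContextRef context)
    {
        // Build an attributed string that uses a specific Core Text font.
        CTFontRef font = CTFontCreateWithName(CFSTR("Helvetica"), 24.0, NULL);
        CFStringRef keys[] = { kCTFontAttributeName };
        CFTypeRef values[] = { font };
        CFDictionaryRef attributes = CFDictionaryCreate(kCFAllocatorDefault,
                (const void **)keys, (const void **)values, 1,
                &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
        CFAttributedStringRef string = CFAttributedStringCreate(kCFAllocatorDefault,
                CFSTR("Hello, Core Text"), attributes);

        // Lay out a single line and draw it into the Quartz context.
        CTLineRef line = CTLineCreateWithAttributedString(string);
        CGContextSetTextPosition(context, 20.0, 20.0);
        CTLineDraw(line, context);

        CFRelease(line);
        CFRelease(string);
        CFRelease(attributes);
        CFRelease(font);
    }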
For more information about Core Text, see Core Text Programming Guide and Core Text Reference Collection.
Apple Type Services (ATS) is an engine for the systemwide management, layout, and rendering of fonts. With ATS, users can have a single set of fonts distributed over different parts of the file system or even over a network. ATS makes the same set of fonts available to all clients. The centralization of font rendering and layout contributes to overall system performance by consolidating expensive operations such as synthesizing font data and rendering glyphs. ATS provides support for a wide variety of font formats, including TrueType, PostScript Type 1, and PostScript OpenType. For more information about ATS, see Apple Type Services for Fonts Programming Guide.
Note: In Mac OS X v10.5 and later, you should consider using the Core Text font-handling API instead of this technology. For more information, see “Core Text.”
Apple Type Services for Unicode Imaging (ATSUI) is the technology behind all text drawing in Mac OS X. ATSUI gives developers precise control over text layout features and supports high-end typography. It is intended for developers of desktop publishing applications or any application that requires the precise manipulation of text. For information about ATSUI, see ATSUI Programming Guide.
Note: In Mac OS X v10.5 and later, you should consider using the Core Text API instead of this technology. For more information, see “Core Text.”
The Multilingual Text Engine (MLTE) is an API that provides Carbon-compliant Unicode text editing. MLTE replaces TextEdit and provides an enhanced set of features, including document-wide tabs, text justification, built-in scroll bar handling, built-in printing support, inline input, multiple levels of undo, support for more than 32 KB of text, and support for Apple Type Services. This API is designed for developers who want to incorporate a full set of text editing features into their applications but do not want to worry about managing the text layout or typesetting. For more information about MLTE, see Handling Unicode Text Editing With MLTE.
In Mac OS X v10.5 and later, the QuickDraw-related features of MLTE are deprecated. The features that use HITextView are still supported, however.
Note: In Mac OS X v10.5 and later, you should consider using the Core Text API instead of this technology. For more information, see “Core Text.”
Mac OS X includes support for high-quality audio creation and reproduction.
The Core Audio frameworks of Mac OS X offer a sophisticated set of services for manipulating multichannel audio. You can use Core Audio to generate, record, mix, edit, process, and play audio. You can also use Core Audio to generate, record, process, and play MIDI data using both hardware and software MIDI instruments.
For the most part, the interfaces of the Core Audio frameworks are C-based, although some of the Cocoa-related interfaces are implemented in Objective-C. The use of C-based interfaces results in a low-latency, flexible programming environment that you can use from both Carbon and Cocoa applications. Some of the benefits of Core Audio include the following:
Built-in support for reading and writing a wide variety of audio file and data formats
Plug-in interfaces for handling custom file and data formats
Plug-in interfaces for performing audio synthesis and audio digital signal processing (DSP)
A modular approach for constructing audio signal chains
Scalable multichannel input and output
Easy synchronization of audio and MIDI data during recording or playback
Support for playing and recording digital audio, including support for scheduled playback and synchronization and for getting timing and control information
A standardized interface to all built-in and external hardware devices, regardless of connection type (USB, FireWire, PCI, and so on)
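As a small illustration of the C-based interfaces, the following sketch fills in the AudioStreamBasicDescription structure that most Core Audio APIs use to describe a stream format; the 44.1 kHz, stereo, 32-bit float linear PCM format shown here is just an example choice.

    #include <CoreAudio/CoreAudioTypes.h>

    // A sketch only: describe a 44.1 kHz stereo float PCM stream.
    static AudioStreamBasicDescription StereoFloatFormat(void)
    {
        AudioStreamBasicDescription format = { 0 };
        format.mSampleRate       = 44100.0;
        format.mFormatID         = kAudioFormatLinearPCM;
        format.mFormatFlags      = kAudioFormatFlagIsFloat |
                                   kAudioFormatFlagIsPacked;
        format.mChannelsPerFrame = 2;
        format.mBitsPerChannel   = 32;
        format.mBytesPerFrame    = format.mChannelsPerFrame * sizeof(Float32);
        format.mFramesPerPacket  = 1;                      // one frame per packet for PCM
        format.mBytesPerPacket   = format.mBytesPerFrame;
        return format;
    }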
For an overview of Core Audio and its features, see Core Audio Overview. For reference information, see Core Audio Framework Reference.
Introduced in Mac OS X v10.4, the Open Audio Library (OpenAL) audio system adds another way to create audio for your software. The OpenAL interface is a cross-platform standard for delivering 3D audio in applications. It lets you implement high-performance positional audio in games and other programs that require high-quality audio output. Because it is a cross-platform standard, the applications you write using OpenAL on Mac OS X can be ported to run on many other platforms.
In Mac OS X v10.5, several features were incorporated into the existing OpenAL framework. Among these features are support for audio capture, exponential and linear distance models, location offsets, and spatial effects such as reverb and occlusion. In addition, more control is provided for some Core Audio features such as mixer sample rates.
Apple’s implementation of OpenAL is based on Core Audio, so it delivers high-quality sound and performance on all Mac OS X systems. To use OpenAL in a Mac OS X application, include the OpenAL framework (OpenAL.framework) in your Xcode project. This framework includes header files whose contents conform to the OpenAL specification, which is described at http://www.openal.org.
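The following sketch shows the general shape of OpenAL setup on Mac OS X: opening the default device, creating a context, and positioning a source in 3D space. Loading sample data into a buffer is omitted, so the alSourcePlay call is only illustrative.

    #include <OpenAL/al.h>
    #include <OpenAL/alc.h>

    // A sketch only: basic OpenAL device, context, and source setup.
    static void PlayPositionedSource(void)
    {
        ALCdevice  *device  = alcOpenDevice(NULL);          // default output device
        ALCcontext *context = alcCreateContext(device, NULL);
        alcMakeContextCurrent(context);

        ALuint source;
        alGenSources(1, &source);
        alSource3f(source, AL_POSITION, 2.0f, 0.0f, -1.0f); // place the source to the right
        alSourcef(source, AL_GAIN, 0.8f);

        // After attaching a filled buffer with alSourcei(source, AL_BUFFER, ...),
        // playback would be started with:
        alSourcePlay(source);
    }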
For more information on the Mac OS X implementation of OpenAL, go to http://developer.apple.com/audio/openal.html.
The video technologies in Mac OS X allow you to work with movies and other time-based content, including audio.
QuickTime is a powerful multimedia technology for manipulating, enhancing, and storing video, sound, animation, graphics, text, music, and even 360-degree virtual reality content. It allows you to stream digital video, where the data stream can be either live or stored. QuickTime is a cross-platform technology, supporting Mac OS X, Mac OS 9, Windows 98, Windows Me, Windows 2000, Windows XP, and Windows Vista. Using QuickTime, developers can perform actions such as the following:
Open and play movie files
Open and play audio files
Display still images
Translate still images from one format to another
Compress audio, video, and still images
Synchronize multiple media to a common timeline
Capture audio and video from an external device
Stream audio and video over a LAN or the Internet
Create and display virtual reality objects and panoramas
For a long time, QuickTime has included programming interfaces for the C and C++ languages. Beginning with Mac OS X v10.4, the QuickTime Kit provides an Objective-C based set of classes for managing QuickTime content. For more information about QuickTime Kit, see “QuickTime Kit.”
Note: In Mac OS X v10.5 and later, you must use the QuickTime Kit framework to create 64-bit applications. The QuickTime C-based APIs are not supported in 64-bit applications.
QuickTime supports more than a hundred media types, covering a range of audio, video, image, and streaming formats. Table 3-2 lists some of the more common file formats it supports. For a complete list of supported formats, see the QuickTime product specification page at http://www.apple.com/quicktime/pro/specs.html.
Image formats | PICT, BMP, GIF, JPEG, TIFF, PNG |
Audio formats | AAC, AIFF, MP3, WAVE, uLaw |
Video formats | AVI, AVR, DV, M-JPEG, MPEG-1, MPEG-2, MPEG-4, AAC, OpenDML, 3GPP, 3GPP2, AMC, H.264 |
Web streaming formats |
The QuickTime architecture is very modular. QuickTime includes media handler components for different audio and video formats. Components also exist to support text display, Flash media, and codecs for different media types. However, most applications do not need to know about specific components. When an application tries to open and play a specific media file, QuickTime automatically loads and unloads the needed components. Of course, applications can specify components explicitly for many operations.
You can extend QuickTime by writing your own component. You might write your own QuickTime component to support a new media type or to implement a new codec. You might also write components to support a custom video capture card. By implementing your code as a QuickTime component, you enable other applications to take advantage of your code and use it to support your hardware or media file formats. See “QuickTime Components” for more information.
Introduced in Mac OS X version 10.4, the QuickTime Kit (QTKit.framework) is an Objective-C framework for manipulating QuickTime-based media. This framework lets you incorporate movie playback, movie editing, export to standard media formats, and other QuickTime behaviors easily into your applications. The classes in this framework open up a tremendous amount of QuickTime behavior to both Carbon and Cocoa developers. Instead of learning how to use the more than 2500 functions in QuickTime, you can now use a handful of classes to implement the features you need.
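For illustration, the following sketch loads a movie file and plays it in a QTMovieView; the controller class and its movieView outlet are hypothetical names assumed to be connected in Interface Builder.

    #import <Cocoa/Cocoa.h>
    #import <QTKit/QTKit.h>

    // A sketch only: movieView is assumed to be connected in a nib.
    @interface MovieController : NSObject
    {
        IBOutlet QTMovieView *movieView;
    }
    - (void)openAndPlayMovieAtPath:(NSString *)path;
    @end

    @implementation MovieController
    - (void)openAndPlayMovieAtPath:(NSString *)path
    {
        NSError *error = nil;
        QTMovie *movie = [QTMovie movieWithFile:path error:&error];
        if (movie != nil) {
            [movieView setMovie:movie];   // display the movie in the view
            [movie play];                 // begin playback
        }
    }
    @end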
In Mac OS X v10.5, support was added for capturing professional-quality audio and video content from one or more external sources, including cameras, microphones, USB and FireWire devices, DV media devices, QuickTime streams, data files, and the screen. The input and output classes included with the framework provide all of the components necessary to implement the most common use case for a media capture system: recording from a camera to a QuickTime file. Video capture includes frame-accurate audio/video synchronization, plus you can preview captured content and save it to a file or stream.
Note: The QuickTime Kit framework supersedes the NSMovie and NSMovieView classes available in Cocoa. If your code uses these older classes, you should change your code to use the QuickTime Kit instead.
For information on how to use the QuickTime Kit, see QuickTime Kit Programming Guide and QTKit Capture Programming Guide. For reference information about the QuickTime Kit classes, see QTKit Framework Reference.
Introduced in Mac OS X version 10.4, Core Video provides a modern foundation for delivering video in your applications. It creates a bridge between QuickTime and the graphics card’s GPU to deliver hardware-accelerated video processing. By offloading complex processing to the GPU, you can significantly increase performance and reduce the CPU load of your applications. Core Video also allows developers to apply all the benefits of Core Image to video, including filters and effects, per-pixel accuracy, and hardware scalability.
In Mac OS X v10.4, Core Video is part of the Quartz Core framework (QuartzCore.framework). In Mac OS X v10.5 and later, the interfaces are duplicated in the Core Video framework (CoreVideo.framework).
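The following sketch shows one common Core Video pattern, a display link that invokes a callback once per screen refresh on a dedicated thread; the callback body is left empty as a placeholder for per-frame processing.

    #include <CoreVideo/CoreVideo.h>

    // A sketch only: called once per refresh of the display.
    static CVReturn FrameCallback(CVDisplayLinkRef displayLink,
                                  const CVTimeStamp *now,
                                  const CVTimeStamp *outputTime,
                                  CVOptionFlags flagsIn,
                                  CVOptionFlags *flagsOut,
                                  void *userInfo)
    {
        // Render or decode the frame scheduled for outputTime here.
        return kCVReturnSuccess;
    }

    static void StartDisplayLink(void)
    {
        CVDisplayLinkRef displayLink = NULL;
        CVDisplayLinkCreateWithActiveCGDisplays(&displayLink);
        CVDisplayLinkSetOutputCallback(displayLink, FrameCallback, NULL);
        CVDisplayLinkStart(displayLink);
    }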
For information about using the Core Video framework, see Core Video Programming Guide.
Mac OS X version 10.3 and later includes the DVD Playback framework for embedding DVD viewer capabilities into an application. In addition to playing DVDs, you can use the framework to control various aspects of playback, including menu navigation, viewer location, angle selection, and audio track selection. You can play back DVD data from disc or from a local VIDEO_TS directory.
For more information about using the DVD Playback framework, see DVD Playback Services Programming Guide.
ColorSync is the color management system for Mac OS X. It provides essential services for fast, consistent, and accurate color calibration, proofing, and reproduction as well as an interface for accessing and managing systemwide color management settings. It also supports color calibration with hardware devices such as printers, scanners, and displays.
Beginning with Mac OS X version 10.3, the system provides improved support for ColorSync. In most cases, you do not need to call ColorSync functions at all. Quartz and Cocoa automatically use ColorSync to manage pixel data when drawing on the screen. They also respect ICC (International Color Consortium) profiles and apply the system’s monitor profile as the source color space. However, you might need to use ColorSync directly if you define a custom color management module (CMM), which is a component that implements color-matching, color-conversion, and gamut-checking services.
For information about the ColorSync API, see ColorSync Manager Reference.
Printing support in Mac OS X is implemented through a collection of APIs and system services available to all application environments. Drawing on the capabilities of Quartz, the printing system delivers a consistent human interface and makes shorter development cycles possible for printer vendors. It also provides applications with a high degree of control over the user interface elements in printing dialogs. Table 3-3 describes some other features of the Mac OS X printing system.
For an overview of the printing architecture and how to support it, see Mac OS X Printing System Overview.
Mac OS X takes advantage of available hardware wherever it can to improve performance. In the case of repetitive tasks operating on large data sets, Mac OS X uses the vector-oriented extensions provided by the processor. (Mac OS X currently supports the PowerPC AltiVec extensions and the Intel x86 SSE extensions.) Hardware-based vector units boost the performance of any application that exploits data parallelism, such as those that perform 3D graphic imaging, image processing, video processing, audio compression, and software-based cell telephony. Quartz and QuickTime incorporate vector capabilities, so any application using these APIs can tap into this hardware acceleration without making any changes.
In Mac OS X v10.3 and later, you can use the Accelerate framework (Accelerate.framework) to accelerate complex operations using the available vector unit. This framework supports both the PowerPC AltiVec and Intel x86 SSE extensions internally but provides a single interface for you to use in your application. The advantage of using this framework is that you can simply write your code once without having to code different execution paths for each hardware platform. The functions of this framework are highly tuned for the specific platforms supported by Mac OS X and in many cases can offer better performance than hand-rolled code.
The Accelerate framework is an umbrella framework that wraps the vecLib and vImage frameworks into a single package. The vecLib framework contains vector-optimized routines for doing digital signal processing, linear algebra, and other computationally expensive mathematical operations. (The vecLib framework is also a top-level framework for applications running on versions of Mac OS X up to and including version 10.5.) The vImage framework supports the visual realm, adding routines for morphing, alpha-channel processing, and other image-buffer manipulations.
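For example, the following sketch uses the vDSP portion of vecLib to add two small float vectors element by element; the same call runs on either vector unit without any platform-specific code.

    #include <Accelerate/Accelerate.h>

    // A sketch only: result[i] = a[i] + b[i] using vDSP.
    static void AddVectors(void)
    {
        float a[4]      = { 1.0f, 2.0f, 3.0f, 4.0f };
        float b[4]      = { 10.0f, 20.0f, 30.0f, 40.0f };
        float result[4];

        // The 1s are the strides through each array; 4 is the element count.
        vDSP_vadd(a, 1, b, 1, result, 1, 4);
    }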
For information on how to use the components of the Accelerate framework, see vImage Programming Guide, vImage Reference Collection, and vecLib Framework Reference. For general performance-related information, see Reference Library > Performance.
© 2004, 2008 Apple Inc. All Rights Reserved. (Last updated: 2008-10-15)