Basics of the Media Layer for iOS App Development

By Rajiv Ramnath

The Media layer in iOS apps contains support for graphics, audio, and video technologies, allowing you to render content and to manage media such as music and movie files along with their metadata. The layer's graphics and imaging components are the following:

  • Core Graphics (also known as Quartz): Natively handles 2D vector- and image-based rendering.

  • Core Animation: Provides support for animating views and other content. This is also a part of Quartz.

  • Core Image: Provides support for manipulating video and still images.

  • OpenGL ES and GLKit components: Provide support for 2D and 3D rendering using hardware-accelerated interfaces.

  • Core Text: Provides a text layout and rendering engine.

  • Image I/O: Provides interfaces for reading and writing most image formats.

  • Assets Library: Provides access to the photos and videos in the user’s photo library.
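As a brief illustration of the first of these components, a `UIView` subclass can override `draw(_:)` and use Core Graphics (Quartz 2D) calls to render vector content. This is a minimal sketch; the view class and what it draws are hypothetical:

```swift
import UIKit

// A hypothetical view that uses Core Graphics (Quartz 2D)
// to render simple 2D vector content.
class CircleView: UIView {
    override func draw(_ rect: CGRect) {
        // Obtain the current Quartz drawing context.
        guard let context = UIGraphicsGetCurrentContext() else { return }

        // Fill the background, then stroke a circle inset from the edges.
        context.setFillColor(UIColor.white.cgColor)
        context.fill(rect)

        context.setStrokeColor(UIColor.blue.cgColor)
        context.setLineWidth(4)
        context.strokeEllipse(in: rect.insetBy(dx: 10, dy: 10))
    }
}
```

The system calls `draw(_:)` whenever the view needs to repaint, so the drawing code itself never needs to be invoked directly.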

A MIDI interface is provided for connection with musical instruments.
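That MIDI support is exposed through the Core MIDI framework. The following is a minimal sketch of receiving data from a connected instrument; the client and port names are illustrative, and error handling is omitted for brevity:

```swift
import CoreMIDI

// A sketch: create a MIDI client and an input port so the app
// can receive data from a connected musical instrument.
var client = MIDIClientRef()
MIDIClientCreateWithBlock("ExampleClient" as CFString, &client) { _ in
    // Called when MIDI devices are added or removed.
}

var inputPort = MIDIPortRef()
MIDIInputPortCreateWithBlock(client, "Input" as CFString, &inputPort) { packetList, _ in
    // Process incoming MIDI packets from the instrument here.
}

// Connect the first available MIDI source (e.g., a keyboard), if any.
if MIDIGetNumberOfSources() > 0 {
    MIDIPortConnectSource(inputPort, MIDIGetSource(0), nil)
}
```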

Integrated record and playback of audio is provided as follows:

  • Through a media player that allows you to manipulate iTunes playlists

  • Via lower-level components for

    • Managing audio playback and recording

    • Managing positional audio playback (such as surround sound)

    • Playing system alert sounds

    • Vibrating a device

    • Buffering streamed audio content

    • AirPlay streaming
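As one example of the lower-level components, audio playback and recording are handled by AVFoundation. The sketch below plays a sound file bundled with the app; the file name is hypothetical, and a real app would also configure its `AVAudioSession` and handle errors more carefully:

```swift
import AVFoundation

// A minimal sketch of audio playback with AVAudioPlayer.
final class SoundPlayer {
    // Keep a strong reference so the player is not deallocated
    // before playback finishes.
    private var player: AVAudioPlayer?

    func play() {
        guard let url = Bundle.main.url(forResource: "alert",
                                        withExtension: "caf") else { return }
        player = try? AVAudioPlayer(contentsOf: url)
        player?.prepareToPlay()
        player?.play()
    }
}
```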

Video services include playing movie files from within your application, streaming them from the network, and capturing video for incorporation into your application. As with audio, this functionality is provided in several ways, from a high-level media player down to lower-level components that give you fine-grained control.
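At the high-level end, a movie can be streamed from the network and presented with `AVPlayerViewController`. This is a sketch only; the stream URL is illustrative:

```swift
import UIKit
import AVKit
import AVFoundation

// A sketch of high-level movie playback: stream a movie from the
// network and present it with the standard player UI.
func playMovie(from presenter: UIViewController) {
    guard let url = URL(string: "https://example.com/movie.m3u8") else { return }
    let controller = AVPlayerViewController()
    controller.player = AVPlayer(url: url)
    presenter.present(controller, animated: true) {
        controller.player?.play()
    }
}
```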

Image-handling operations include creating, displaying, and storing pictures, as well as applying filters and performing feature detection.
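Filtering is the province of Core Image. As a sketch, the function below applies a standard sepia-tone filter to a `UIImage`; the filter name and keys are Core Image's built-in strings:

```swift
import UIKit
import CoreImage

// A sketch of Core Image filtering: apply a sepia-tone filter.
func sepia(_ image: UIImage) -> UIImage? {
    guard let input = CIImage(image: image),
          let filter = CIFilter(name: "CISepiaTone") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)

    // Render the filtered result back into a UIImage.
    let context = CIContext()
    guard let output = filter.outputImage,
          let cgImage = context.createCGImage(output, from: output.extent)
    else { return nil }
    return UIImage(cgImage: cgImage)
}
```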

This layer also provides support for text and font handling, such as layout and rendering.
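That text support comes from Core Text. The sketch below lays out an attributed string with a `CTFramesetter` and draws it into a Quartz context, such as the one available inside a view's `draw(_:)` method:

```swift
import UIKit
import CoreText

// A sketch of Core Text layout: lay out an attributed string in a
// rectangle and draw it into a Core Graphics context.
func drawText(_ text: String, in context: CGContext, bounds: CGRect) {
    let attributed = NSAttributedString(
        string: text,
        attributes: [.font: UIFont.systemFont(ofSize: 18)]
    )
    let framesetter = CTFramesetterCreateWithAttributedString(attributed)
    let path = CGPath(rect: bounds, transform: nil)
    let frame = CTFramesetterCreateFrame(
        framesetter, CFRange(location: 0, length: 0), path, nil)

    // Core Text draws in a flipped coordinate space relative to UIKit.
    context.textMatrix = .identity
    context.translateBy(x: 0, y: bounds.height)
    context.scaleBy(x: 1, y: -1)
    CTFrameDraw(frame, context)
}
```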