The “OpenGL ES Application” template in Xcode is quite illuminating. You can generate it from the Xcode menu: “File” -> “New Project” -> “OpenGL ES Application”. It will build and run as is, and draw an oscillating colored square in the Simulator. The source code isn’t long, but there’s a lot to look at. Some highlights follow.
Contexts and Layers
OpenGL and UIKit/Quartz 2D represent completely different rendering paradigms, and they’ve got to be glued together somehow. That glue is largely represented by the CAEAGLLayer and EAGLContext classes. CAEAGLLayer serves as an “outbound bridge”, exporting the results of OpenGL ES rendering into the Core Animation rendering pipeline, while EAGLContext is an “inbound bridge” that accepts OpenGL rendering commands from an iOS application.
The template’s EAGLView class wraps a CAEAGLLayer, and its view controller’s awakeFromNib method sets up an EAGLContext, which is used throughout the demo.
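For orientation, here is a condensed sketch of how those pieces get wired together. The class name GlueSketchView and the bare-bones setupGL method are mine, and the error handling, drawable properties, and framebuffer attachment are all trimmed; treat it as a paraphrase of the idea, not the template’s actual code.

    #import <UIKit/UIKit.h>
    #import <QuartzCore/QuartzCore.h>
    #import <OpenGLES/EAGL.h>
    #import <OpenGLES/ES1/gl.h>
    #import <OpenGLES/ES1/glext.h>

    @interface GlueSketchView : UIView
    @end

    @implementation GlueSketchView

    // Back the view with a CAEAGLLayer instead of a plain CALayer.
    + (Class)layerClass {
        return [CAEAGLLayer class];
    }

    // Hook the two bridges together (ES 1.1 flavor; error checks omitted).
    - (void)setupGL {
        EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
        [EAGLContext setCurrentContext:context];

        // A renderbuffer whose storage is the CAEAGLLayer itself: whatever
        // OpenGL ES draws into it becomes the layer's contents in Core Animation.
        GLuint colorRenderbuffer;
        glGenRenderbuffersOES(1, &colorRenderbuffer);
        glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
        [context renderbufferStorage:GL_RENDERBUFFER_OES
                        fromDrawable:(CAEAGLLayer *)self.layer];
    }

    @end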
1.1 vs 2.0
There are (currently!) 2 versions of OpenGL ES available on iOS: 1.1 and 2.0, which are very, very different. OpenGL ES 1.1 supports a “fixed function pipeline” (FFP), while 2.0 supports a “shader pipeline” (SP). An FFP runs input data through a predetermined set of processing machinery, which can be configured in a number of specified ways. An SP applies essentially arbitrary “shader” functions to its input data. These functions are written in a C-like language called the OpenGL ES Shading Language, or GLSL ES.
The template’s view controller does a good job of illustrating simple rendering using both pipelines; pay particular attention to just how different they are. Supporting both is probably no one’s idea of a first choice.
FWIW, the 1.1 FFP is easier to work with and supported by all iOS devices, but less powerful. The 2.0 SP is considerably more flexible, but only available on some devices (basically the iPhone 3GS and later) and somewhat harder to get started with.
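The template papers over the split by asking for a 2.0 context and quietly falling back to 1.1 when the hardware can’t oblige. A paraphrase of the idea (from memory, not the template’s literal code):

    // Prefer the shader pipeline; fall back to the fixed-function pipeline
    // on hardware that only supports OpenGL ES 1.1.
    EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    if (context == nil) {
        context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
    }
    [EAGLContext setCurrentContext:context];

    // Later, drawFrame checks which API it actually got and renders accordingly.
    BOOL usingES2 = ([context API] == kEAGLRenderingAPIOpenGLES2);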
GLSL ES
The template includes the Shader.fsh and Shader.vsh files; these are fragment and vertex shaders, respectively. Note that these files are not compiled by Xcode; they are packed into the application’s bundle, then extracted and compiled at run time by the view controller’s loadShaders method.
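Here is what “compiled at run time” boils down to: read the shader source out of the bundle and hand it to GL for compilation. This is a simplified sketch (the method name and the missing error logging are my doing), not the template’s loadShaders verbatim.

    // Compile one shader of the given type (GL_VERTEX_SHADER or GL_FRAGMENT_SHADER)
    // from a file in the app bundle; returns 0 on failure.
    - (GLuint)compileShaderOfType:(GLenum)type file:(NSString *)file {
        NSString *source = [NSString stringWithContentsOfFile:file
                                                     encoding:NSUTF8StringEncoding
                                                        error:nil];
        const GLchar *src = (const GLchar *)[source UTF8String];

        GLuint shader = glCreateShader(type);
        glShaderSource(shader, 1, &src, NULL);
        glCompileShader(shader);

        GLint compiled = 0;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
        return compiled ? shader : 0;
    }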
Timers
The template’s animation is driven by a CADisplayLink object. CADisplayLink instances automagically fire a target/action when the device needs a new frame to display. They’re a little opaque, but seem to represent the iOS version of vsync. The view controller sets up and tears down CADisplayLinks in startAnimation and stopAnimation.
Note that the view controller creates the CADisplayLink with the displayLinkWithTarget:selector: method, which retains the target. To avoid a retain cycle, the view controller itself maintains only a weak reference to the CADisplayLink.
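In rough outline (the ivar and method bodies here are my guess at the shape of it, not necessarily the template’s):

    - (void)startAnimation {
        // displayLinkWithTarget:selector: retains self, so keep only a
        // non-retaining reference to the link to avoid a retain cycle.
        displayLink = [CADisplayLink displayLinkWithTarget:self
                                                  selector:@selector(drawFrame)];
        [displayLink addToRunLoop:[NSRunLoop currentRunLoop]
                          forMode:NSDefaultRunLoopMode];
    }

    - (void)stopAnimation {
        // invalidate removes the link from its run loop and releases the target.
        [displayLink invalidate];
        displayLink = nil;
    }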
Transforms
It’s a small point, but if you look at the view controller’s drawFrame method, you’ll note that the “square’s” vertices are actually rectangular; they describe a rectangle 1 unit wide by 0.66 units high. Nevertheless, the object appears square when displayed. Why is this?
The answer lies in the physical properties of the display; iPhone screens have a 2:3 width:height ratio (e.g., 320 x 480 square pixels). Absent other transformations, the point (-1, -1) will be mapped to the lower-left corner of the display, and the point (1, 1) to the upper right. Since the display is taller than it is wide, this mapping effectively squeezes the image horizontally: the 1 x 0.66 rectangle comes out at roughly 160 x 158 pixels, which is square, near enough.
Naturally, this issue is handled more rigorously in production code.
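For instance, under the 1.1 pipeline one common fix is to fold the screen’s aspect ratio into the projection matrix, so a unit of x and a unit of y cover the same number of pixels. (backingWidth and backingHeight here stand for the renderbuffer’s dimensions; this is a generic recipe, not the template’s code.)

    // Make one unit the same physical length in x and y by matching the
    // projection to the viewport's aspect ratio (e.g. 320/480 on an iPhone).
    GLfloat aspect = (GLfloat)backingWidth / (GLfloat)backingHeight;
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrthof(-1.0f, 1.0f, -1.0f / aspect, 1.0f / aspect, -1.0f, 1.0f);
    glMatrixMode(GL_MODELVIEW);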
In Conclusion
OpenGL ES (especially 2.0) can be a big pile of strange the first time you look at it, so it was nice of AAPL to provide a template that not only takes care of all the mechanics needed to get something on the screen, but that also effectively illustrates some of the surprising things in the API. Xcode’s “OpenGL ES Application” template is really a nice little tutorial, and will reward your study.