
A Gentle Introduction to OpenGL ES on iOS

OpenGL ES is one of the most intimidating technology stacks on mobile platforms today. Outside of audio/video processing, there isn’t much with the same performance demands, intractable troubleshooting, and deeply entrenched knowledge. Yet the introductory texts available online tend to be of two varieties: “Introduction to OpenGL” and “OpenGL ES If You’re An OpenGL Expert.”

This session aims to introduce you to the wonderful world of OpenGL ES programming without assuming any non-ES OpenGL knowledge. We’ll stick to iOS (hey, the session is only 50 minutes long) and look at the fundamentals of how OpenGL works, how we can use Apple’s GLKit framework to streamline development, and how awesome the OpenGL debugging tools in Xcode are (really). When we’re done you’ll have seen how to draw a basic scene on iOS, and at the very least know if you want to pursue learning more or run away screaming.

Jeff Kelley

May 30, 2014

Transcript

  1. A Gentle Introduction
    to OpenGL ES on iOS
    Jeff Kelley
    Self.conference, May 30th, 2014
    https://github.com/SlaunchaMan/IntroToOpenGLESCode


  2. OpenGL
    • Cross-Language, Cross-Platform 2D/3D
    Graphics API
    • OS X, Windows, Linux, etc.
    • OpenGL ES—“embedded systems”
    • A subset of OpenGL for low-powered devices
    • iOS, Android, WebGL, PlayStation 3


  3. What does OpenGL Do?
    • Provides a common set of APIs for rendering graphics
    • 3D graphics arranged into triangles in 3D space
    • OpenGL can also draw quad and polygon primitives,
    but OpenGL ES cannot
    • Platforms are responsible for providing some integration
    points, but the APIs are mostly the same across platforms
    • Apple provides convenience APIs around OpenGL ES
    to make common tasks easier


  4. What’s the Difference Between
    OpenGL and OpenGL ES?
    • OpenGL ES is a subset of OpenGL
    • Redundant functionality removed
    • More restricted set of APIs to avoid using too
    much power


  5. Why OpenGL?
    • Graphics hardware is massively parallel for
    floating-point calculations
    • Can do hundreds or thousands of concurrent
    calculations with 3D data
    • Lets you get very close to the metal on the
    graphics card, allowing for huge performance
    wins


  6. How Does it Work?
    • OpenGL ES is a state machine
    • Set options you want, perform drawing
    commands, receive output
    • Whatever options you enable/disable stay that
    way until you say otherwise
    • After enabling options, you pass in coordinates
    in 3D space to be drawn, and you have an
    image!
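
    A minimal sketch (not from the slides) of the state-machine idea; options persist until you change them:

    glEnable(GL_DEPTH_TEST);               // depth testing now applies to every draw call
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);  // so does this clear color...
    // ...until you explicitly change the state again:
    glDisable(GL_DEPTH_TEST);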


  7. Where Does The Image Go?
    • Most of the time in OpenGL you’re drawing to
    something on the screen
    • Every OpenGL command must be made with an
    active context, which is one of those things the
    platform must provide
    • On iOS, that’s EAGLContext
    • Once you have an active context, you need to provide
    a framebuffer, which is where you’ll receive the image
    output
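
    A minimal sketch of getting an active context on iOS, assuming OpenGL ES 2.0:

    #import <OpenGLES/EAGL.h>

    EAGLContext *context =
        [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    [EAGLContext setCurrentContext:context];  // gl* calls now target this context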


  8. What is a Framebuffer?
    • Basically, just a reservoir of
    pixels
    • Let’s look into the image…


  9. 00110011 00110011 00110011 00110011 0011
    00110011 00110011 00110011 00110011 0011
    00110011 00110011 00110011 00110011 0011
    00110011 00110011 00110011 00110011 0011
    00110011 00110011 00110011 00110011 0011
    00110011 00110011 00110011 00110011 0011
    00110011 00110011 00110011 00110011 0011
    00110011 00110011 00110011 00110011 0011
    00110011 00110011 00110011 00110011 0011

    View full-size slide

  10. 00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011
    00110011 00110011 00110011 00110011 00110011 00110011

    View full-size slide

  11. Framebuffers
    • A framebuffer is the reservoir of pixels you saw last slide
    • Usually, on-screen
    • On iOS, usually in a GLKView
    • Can also be off-screen
    • Use OpenGL to render to a file
    • Render to a texture, then use that texture in a subsequent render
    (for reflections, etc.)
    • More on texturing later
    • One of the things that a platform must provide
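
    A sketch of the off-screen case the slide mentions: rendering into a
    texture with plain OpenGL ES calls (the 256×256 size is arbitrary):

    GLuint framebuffer, texture;
    glGenFramebuffers(1, &framebuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);

    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 256, 256, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);  // empty texture to render into
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    // Attach the texture as the framebuffer's color output
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, texture, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE) {
        // Drawing commands issued now land in the texture,
        // which a later pass can sample for reflections, etc.
    }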


  12. Animating
    • Often you will want to animate your drawing (e.g. if
    you have 3D toppings falling on a 3D pizza)
    • The naïve way:
    while (true) {
        [self clearFramebuffer];
        [self drawEverything];
        usleep(1000000 / 60);  // ~16.7 ms; sleep() only takes whole seconds
    }


  13. Animating
    • Often you will want to animate your drawing (e.g.
    if you have 3D toppings falling on a 3D pizza)
    • Another naïve way:
    [NSTimer scheduledTimerWithTimeInterval:1.0 / 60.0
                                     target:self
                                   selector:@selector(draw)
                                   userInfo:nil
                                    repeats:YES];


  14. CADisplayLink
    • What you really want to do is synchronize your
    drawing calls with the refresh rate of the display
    • CADisplayLink offers this on iOS
    • + (CADisplayLink *)displayLinkWithTarget:(id)target
                                      selector:(SEL)sel;
    • Calls that selector every time the screen
    refreshes
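
    A minimal sketch of wiring one up (the -draw: method name is just an example):

    #import <QuartzCore/QuartzCore.h>

    CADisplayLink *displayLink =
        [CADisplayLink displayLinkWithTarget:self selector:@selector(draw:)];
    [displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                      forMode:NSRunLoopCommonModes];
    // -draw: now fires once per screen refresh; call -invalidate to stop it.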


  15. GLKViewController
    • Apple provides a view controller in GLKit that
    provides a view (GLKView) with a framebuffer
    and a rendering callback synchronized to the
    display
    - (void)glkView:(GLKView *)view drawInRect:(CGRect)rect;
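
    A sketch of how the pieces fit together in a GLKViewController subclass
    (the class name is made up; this is not the talk's exact code):

    #import <GLKit/GLKit.h>

    @interface MyViewController : GLKViewController
    @end

    @implementation MyViewController

    - (void)viewDidLoad
    {
        [super viewDidLoad];

        EAGLContext *context =
            [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

        GLKView *view = (GLKView *)self.view;
        view.context = context;
        [EAGLContext setCurrentContext:context];
    }

    // GLKit calls this in sync with the display (see CADisplayLink above)
    - (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
    {
        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        // Drawing code goes here; GLKit presents the framebuffer afterward.
    }

    @end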


  16. So How Do I Draw?
    • OpenGL draws a series of shapes (polygons) for 3D
    effects
    • Also supported: lines and points for 2D graphics
    • Triangles are passed to the state machine as a list of
    points in space:
    static const GLfloat coordinates[3][3] = {
        { -0.75f, -0.75f, 0.0f },
        {  0.75f, -0.75f, 0.0f },
        {  0.75f,  0.75f, 0.0f },
    };
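
    The snippets that follow refer to kVertexCount and kCoordinatesPerVertex
    without defining them; a plausible definition to pair with the array above
    (an assumption, not from the slides):

    static const GLint   kCoordinatesPerVertex = 3;  // x, y, z
    static const GLsizei kVertexCount          = 3;  // three vertices make one triangle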


  17. So How Do I Draw?
    • Before you draw, you need to pass your points
    to the context:
    glVertexAttribPointer(GLKVertexAttribPosition,
                          kCoordinatesPerVertex,   // components per vertex, not the vertex count
                          GL_FLOAT,
                          GL_FALSE,                // don't normalize the values
                          sizeof(GLfloat) * kCoordinatesPerVertex,  // stride between vertices
                          coordinates);

    • What we’re actually sending is the address
    where it can find the points, and telling it how
    they’re stored at that address


  18. So How Do I Draw?
    • Once you tell the context where the points are,
    it’s as easy as telling it to draw them:
    glDrawArrays(GL_TRIANGLES, 0, kVertexCount);


  19. …Almost.
    • Just sending the points isn’t enough; we need to
    set up some more state
    • We need a way to translate points to screen
    coordinates and a way to know what color to draw
    • GLKBaseEffect to the rescue!
    • Another Apple-provided part of GLKit that
    encapsulates this task and provides
    convenience methods


  20. Creating Our Effect
    self.effect = [[GLKBaseEffect alloc] init];

    self.effect.constantColor =
        GLKVector4Make(1.0f, 1.0f, 1.0f, 1.0f);
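
    GLKBaseEffect also owns the transform state. The triangle here gets by
    with the default identity transforms, but a perspective setup looks
    roughly like this (values are arbitrary):

    float aspect = fabsf(self.view.bounds.size.width /
                         self.view.bounds.size.height);

    self.effect.transform.projectionMatrix =
        GLKMatrix4MakePerspective(GLKMathDegreesToRadians(45.0f),
                                  aspect, 0.1f, 100.0f);
    self.effect.transform.modelviewMatrix =
        GLKMatrix4MakeTranslation(0.0f, 0.0f, -5.0f);  // move the scene away from the camera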


  21. The Entire Drawing
    Command
    glClear(GL_COLOR_BUFFER_BIT);

    glEnableVertexAttribArray(GLKVertexAttribPosition);

    glVertexAttribPointer(GLKVertexAttribPosition,
                          kCoordinatesPerVertex,   // components per vertex (x, y, z)
                          GL_FLOAT,
                          GL_FALSE,
                          sizeof(GLfloat) * kCoordinatesPerVertex,
                          coordinates);

    [self.effect prepareToDraw];

    glDrawArrays(GL_TRIANGLES, 0, kVertexCount);

    glDisableVertexAttribArray(GLKVertexAttribPosition);


  22. Triangle Demo


  23. GLKBaseEffect
    • GLKit Effects provide the functionality that is
    ordinarily done via shaders: textures, fog,
    lighting, etc.
    • They provide some of the features of OpenGL
    ES 1 that were removed for OpenGL ES 2
    • For many applications, GLKit provides enough
    that you don’t need to write your own shaders
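
    For example, lighting that would otherwise need a shader is a few
    property settings (values are arbitrary; lighting also needs per-vertex
    normals via GLKVertexAttribNormal):

    self.effect.light0.enabled = GL_TRUE;
    self.effect.light0.diffuseColor = GLKVector4Make(1.0f, 1.0f, 1.0f, 1.0f);
    self.effect.light0.position = GLKVector4Make(0.0f, 2.0f, 2.0f, 1.0f);
    self.effect.lightingType = GLKLightingTypePerPixel;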


  24. Shaders
    • Eventually, you will need to write your own shaders if you need
    to use features not provided in GLKBaseEffect
    • Some things you might need to do that aren’t provided:
    • Bump Mapping
    • More than two textures per object
    • More than three lights
    • Custom Lighting Effects
    • Ragdoll Physics


  25. Shaders
    • Shaders are the part of the OpenGL ES pipeline
    that sits between a list of points and image data
    • Every point is run through a vertex shader, and
    the output is another point in the screen’s
    coordinate space
    • Every point inside the triangle on the screen is
    then run through the fragment shader to
    determine what color it should be


  26. OpenGL ES Code


  27. Vertex Shader


  28. Fragment Shader


  29. Shaders
    • Shaders are programs written in a C-like language called
    GLSL (for GL Shading Language)
    • The source code is loaded off of disk and compiled for
    the graphics card of the current device
    • Some implementations can support already-compiled
    binary shaders, but iOS does not currently support
    them
    • Shaders are linked together to make a program
    • GLKBaseEffect creates shaders for you as needed
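
    A condensed sketch of that compile-and-link dance (error handling
    mostly omitted; the GLSL source strings are trivial placeholders):

    const GLchar *vertexSource =
        "attribute vec4 a_position;                 \n"
        "void main() { gl_Position = a_position; }  \n";

    const GLchar *fragmentSource =
        "precision mediump float;                                  \n"
        "void main() { gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0); }  \n";

    GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vertexShader, 1, &vertexSource, NULL);
    glCompileShader(vertexShader);        // compiled for this device's GPU

    GLuint fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fragmentShader, 1, &fragmentSource, NULL);
    glCompileShader(fragmentShader);

    GLuint program = glCreateProgram();   // shaders are linked together into a program
    glAttachShader(program, vertexShader);
    glAttachShader(program, fragmentShader);
    glLinkProgram(program);

    GLint linked = 0;
    glGetProgramiv(program, GL_LINK_STATUS, &linked);
    if (linked) {
        glUseProgram(program);            // used in place of [effect prepareToDraw]
    }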


  30. Vertex Shaders
    • Vertex shaders allow you to specify where on-
    screen the coordinates of your polygons are
    drawn:
    attribute vec3 a_position;
    attribute mat4 a_mvpMatrix;

    uniform mat4 u_projectionMatrix;

    varying vec3 v_position;   // handed off to the fragment shader

    void main()
    {
        v_position = vec3(a_mvpMatrix * vec4(a_position, 1.0));
        gl_Position = u_projectionMatrix * vec4(v_position, 1.0);
    }


  31. Fragment Shaders
    • Once the vertex shader runs, you have a list of
    points on your screen making up a triangle
    • The fragment shader runs for each pixel in that
    shape and outputs a color
    • Used for putting textures on objects, computing
    lighting, etc.
    • Extremely performance-intensive—run millions of
    times per frame


  32. Fragment Shaders
    varying vec4 diffuse, ambient;
    varying vec3 normal, halfVector;

    void main()
    {
        vec3 n, halfV, lightDir;
        float NdotL, NdotHV;

        lightDir = vec3(gl_LightSource[0].position);

        /* The ambient term will always be present */
        vec4 color = ambient;

        /* a fragment shader can't write a varying variable, hence we need
           a new variable to store the normalized interpolated normal */
        n = normalize(normal);

        /* compute the dot product between normal and ldir */
        NdotL = max(dot(n, lightDir), 0.0);

        if (NdotL > 0.0) {
            color += diffuse * NdotL;
            halfV = normalize(halfVector);
            NdotHV = max(dot(n, halfV), 0.0);
            color += gl_FrontMaterial.specular *
                     gl_LightSource[0].specular *
                     pow(NdotHV, gl_FrontMaterial.shininess);
        }

        gl_FragColor = color;
    }
    lighthouse3d.com/tutorials/glsl-tutorial/directional-light-per-pixel/


  33. Color Shader Demo


  34. Varying Variables
    • OpenGL ES automatically interpolates
    between values in the vertices
    • Good for colors, lighting, or anything that
    changes across the surface of an object
    • Since fragment shaders are so
    performance-constrained, use that to your
    advantage and save computation for
    places like varying variables


  35. Texturing
    • 3D points on your object correspond to 2D texture coordinates
    • Newer versions of OpenGL ES can use 3D textures, but mostly
    you’ll use 2D
    • Limited amount of resources for textures, dependent on the device
    • Larger devices have much more memory
    • GLKTextureLoader converts from a file to an OpenGL texture you
    can use with your context
    • Apple ships a tool called texturetool for compressing images
    into PVRTC, a compressed format the GPU can sample directly
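
    A minimal sketch of the GLKTextureLoader path, feeding the result to a
    GLKBaseEffect (the "pizza.png" file name is hypothetical):

    NSString *path = [[NSBundle mainBundle] pathForResource:@"pizza"
                                                     ofType:@"png"];
    NSError *error = nil;
    GLKTextureInfo *textureInfo =
        [GLKTextureLoader textureWithContentsOfFile:path
                                            options:nil
                                              error:&error];
    if (textureInfo != nil) {
        self.effect.texture2d0.name = textureInfo.name;  // the OpenGL texture name
        self.effect.texture2d0.enabled = GL_TRUE;
        // Per-vertex texture coordinates go in via GLKVertexAttribTexCoord0.
    }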


  36. More Resources
