for each object in scene
    determine its corresponding pixels
    for each corresponding pixel
        calculate depth of object at pixel
        if depth is closer than any object yet seen
            put this object in the pixel
        endif
    end
end

The GPU implements the z-buffer algorithm in hardware: rasterization and depth comparison happen at the same time. Together these are the core activities of computer graphics. They are the raison d'être of the GPU: compute intensive, highly optimized, and parallelized.
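The loop above can be sketched in software. Here is a minimal C version, with made-up buffer sizes and helper names (the GPU does this per pixel, in hardware):

```c
#include <float.h>

#define W 4
#define H 4

/* A minimal software sketch of the z-buffer loop (illustrative names;
   the GPU performs this depth test per pixel in hardware). */
typedef struct { float depth; int object; } Pixel;

static Pixel framebuffer[W * H];

static void clear_buffers(void) {
    for (int i = 0; i < W * H; i++) {
        framebuffer[i].depth = FLT_MAX;  /* no object seen yet */
        framebuffer[i].object = -1;
    }
}

/* Attempt to write one object's sample at pixel (x, y). */
static void plot(int x, int y, float depth, int object_id) {
    Pixel *p = &framebuffer[y * W + x];
    if (depth < p->depth) {      /* closer than any object yet seen */
        p->depth = depth;
        p->object = object_id;   /* put this object in the pixel */
    }
}
```

Because the test is per pixel, objects may be submitted in any order; the nearest sample always wins.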
We apply a transformation via a transformation matrix. This transformation is defined in terms of a coordinate frame. A specific location on the transformed object can be interpreted in terms of any coordinate frame we choose.
Coordinate frame transformation is not only motivated by the need to pose a model or aim a camera as in the physical world. In computer graphics land it often has to do with the convenience of performing a particular calculation in one space rather than another.
referring to the ES1 “fixed-function” transformation pipeline provided by the GPU. In ES2 (GLSL) the transformation pipeline must be handled entirely in application space.
push/pop a transformation matrix on/off the transformation stack. This stack is the key data structure for presenting a hierarchical model to the GPU in fixed-function OpenGL.
glPushMatrix - Copy the current matrix and push it on the stack. Subsequent matrix concatenation is done with stack.top. stack.top is the current coordinate frame.
glPopMatrix - Pop the stack. The pre-existing matrix is now stack.top. We revert to the coordinate frame that existed prior to the push.

    glPushMatrix();
    glMultMatrixf(_cameraTransform);

    glEnable(GL_LIGHT3);
    glLightfv(GL_LIGHT3, GL_DIFFUSE, spotLight);

    glPopMatrix();
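Since ES2 has no built-in stack, the application must supply one. A minimal sketch in C of push, pop, and concatenation against stack.top (stack depth and function names are illustrative, not OpenGL API):

```c
#include <string.h>

/* A sketch of an application-side matrix stack mimicking
   glPushMatrix / glPopMatrix / glMultMatrixf semantics. */
#define STACK_DEPTH 32

typedef struct { float m[16]; } Mat4;   /* column-major, as in OpenGL */

static Mat4 stack[STACK_DEPTH];
static int top = 0;

static void load_identity(void) {
    static const float I[16] = {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1};
    memcpy(stack[top].m, I, sizeof I);
}

/* glPushMatrix: copy the current frame; work continues on the copy */
static void push_matrix(void) {
    stack[top + 1] = stack[top];
    top++;
}

/* glPopMatrix: revert to the frame that existed prior to the push */
static void pop_matrix(void) {
    top--;
}

/* glMultMatrixf: stack.top = stack.top * M */
static void mult_matrix(const float M[16]) {
    float r[16];
    for (int col = 0; col < 4; col++)
        for (int row = 0; row < 4; row++) {
            r[col * 4 + row] = 0.0f;
            for (int k = 0; k < 4; k++)
                r[col * 4 + row] += stack[top].m[k * 4 + row] * M[col * 4 + k];
        }
    memcpy(stack[top].m, r, sizeof r);
}
```

Pushing before a local transform and popping after is exactly the pattern of the glPushMatrix / glPopMatrix snippet above.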
You may be surprised to learn there is no concept of camera in OpenGL. Camera posing is equivalent to inverting the camera transform and applying it to the scene observed by the camera.
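Because a camera transform built from rotations and translations is rigid, its inverse has a closed form: transpose the rotation block and rotate-negate the translation. A sketch (column-major 4x4, as OpenGL stores matrices; this is application code, not an OpenGL call):

```c
/* Sketch: to "pose" a camera with rigid transform C (rotation R plus
   translation t), apply C^-1 to the scene. For a rigid transform the
   inverse is R transposed and -R^T t; no general 4x4 inversion needed.
   Matrices are column-major, element (row, col) at m[col * 4 + row]. */
static void invert_rigid(const float c[16], float inv[16]) {
    /* transpose the 3x3 rotation block */
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            inv[j * 4 + i] = c[i * 4 + j];

    /* translation becomes -R^T * t */
    for (int i = 0; i < 3; i++)
        inv[12 + i] = -(inv[0 * 4 + i] * c[12] +
                        inv[1 * 4 + i] * c[13] +
                        inv[2 * 4 + i] * c[14]);

    /* bottom row of a rigid transform */
    inv[3] = inv[7] = inv[11] = 0.0f;
    inv[15] = 1.0f;
}
```

Multiplying the scene by this inverse moves the world into the camera's coordinate frame, which is what "there is no camera" means in practice.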
a collection of triangle strips that combine to form the teapot surface.

    short indices[] = {
        // how many vertices in vertex strip
        26,
        // vertex strip indices
        1122, 1243, 1272, 1242, ... , 1283, 1199,
        ...
    };
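A strip of N vertices encodes N - 2 triangles, each sharing an edge with the previous one; consecutive triangles alternate winding, so a decoder flips every other one. A sketch in C (the index values in any example here are made up, not the teapot data):

```c
/* Sketch of strip decoding: triangle i uses strip indices i, i+1, i+2;
   odd triangles swap the first two so winding stays consistent. */
static void strip_triangle(const short *strip, int tri, short out[3]) {
    if (tri % 2 == 0) {
        out[0] = strip[tri];
        out[1] = strip[tri + 1];
    } else {
        out[0] = strip[tri + 1];
        out[1] = strip[tri];
    }
    out[2] = strip[tri + 2];
}
```

This sharing is why strips are compact: after the first triangle, each new vertex completes a whole new triangle.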
field of computer graphics by Pixar with its rendering API and shading language called RenderMan. A shader is a small function evaluated at every location on a surface being rendered. GLSL borrows heavily from the RenderMan model.
let's get the lay of the land. There are two flavors of shaders in GLSL. A vertex shader is evaluated at each vertex. A fragment shader is evaluated at each screen space pixel corresponding to a sample on the facet being rasterized.
together in a pipeline fashion. Vertex attributes - color, surface normal, texture coordinate - are evaluated in the vertex shader then passed on to the fragment shader where those values are interpolated across the surface.
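"Interpolated across the surface" means the rasterizer blends the three vertices' values with barycentric weights before each fragment shader invocation. A sketch in C for a texture coordinate (type and function names are illustrative):

```c
/* Sketch of attribute interpolation: the rasterizer blends the three
   vertices' values with barycentric weights (wa + wb + wc == 1 inside
   the triangle) before the fragment shader runs. */
typedef struct { float s, t; } TexCoord;

static TexCoord interpolate(TexCoord a, TexCoord b, TexCoord c,
                            float wa, float wb, float wc) {
    TexCoord r;
    r.s = wa * a.s + wb * b.s + wc * c.s;
    r.t = wa * a.t + wb * b.t + wc * c.t;
    return r;
}
```

At a vertex one weight is 1 and the others are 0, so the fragment shader sees exactly the value the vertex shader wrote; everywhere else it sees a smooth blend.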
Shader variables come in different flavors:
attribute - per-vertex data supplied by the application (vertex shader only)
varying vec2 v_st - Varies across the surface
uniform sampler2D myTexture_0 - Invariant throughout the rendering cycle
varying vec2 v_st;

void main() {
    // Visualize the s-t parameterization of the underlying surface
    gl_FragColor.r = v_st.s;
    gl_FragColor.g = v_st.t;
    gl_FragColor.b = 0.0;
    gl_FragColor.a = 1.0;
}

The texture coordinate values v_st.s and v_st.t are used as the red and green channels of the fragment color.
wire our iOS app together with our shaders. We will use a texture shading example. https://github.com/turner/HelloShader/blob/master/HelloShader/Classes/Renderer/GLRenderer.m
object instance ...

    TEITexture *t = (TEITexture *)[self.rendererHelper.renderables objectForKey:@"texture_0"];
    t.location = glGetUniformLocation(m_program, "myTexture_0");

    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, t.name);
    glUniform1i(t.location, 0);

Activate a texture unit, bind it to a texture object, and assign a number to the corresponding texture sampler used in the fragment shader.
Given a surface and a texture we can “attach” the texture to the surface. The interpretation of a texture can go far beyond that of a decal to be applied to a surface. Bumps, opacity, displacement, and much more can be designed with a texture.
to the screen refresh rate. The actual rendering is handled via a selector.

    - (void)startAnimation {
        if (!self.isAnimating) {
            self.displayLink = [NSClassFromString(@"CADisplayLink") displayLinkWithTarget:self
                                                                                 selector:@selector(drawView:)];
            [self.displayLink setFrameInterval:animationFrameInterval];
            [self.displayLink addToRunLoop:[NSRunLoop currentRunLoop]
                                   forMode:NSDefaultRunLoopMode];
            self.animating = YES;
        } // if (!self.isAnimating)
    }
        [EAGLContext setCurrentContext:m_context];

        glBindFramebuffer(GL_FRAMEBUFFER, m_framebuffer);

        // transform, light, shade, etc.

        glBindRenderbuffer(GL_RENDERBUFFER, m_colorbuffer);
        [m_context presentRenderbuffer:GL_RENDERBUFFER];
    }

The render loop. Draw to the colorbuffer then present to the display. This is the classic “ping pong” between back buffer and front buffer.
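The buffer swap itself can be sketched with two plain arrays; presenting just exchanges the roles of front and back buffer (illustrative names and sizes; EAGL manages the real buffers):

```c
/* Sketch of the back/front buffer "ping pong" with two plain arrays. */
typedef struct {
    int a[4], b[4];
    int *front;   /* what the display shows   */
    int *back;    /* what we are drawing into */
} Swapchain;

static void swapchain_init(Swapchain *s) {
    s->front = s->a;
    s->back  = s->b;
}

/* Like presentRenderbuffer: the finished back buffer goes on screen
   and the old front buffer becomes the new drawing target. */
static void present(Swapchain *s) {
    int *tmp = s->front;
    s->front = s->back;
    s->back  = tmp;
}
```

Drawing always targets the back buffer, so the display never shows a half-finished frame.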