
Shaders in Unity3D


From an internal presentation I had at our company.

Covering:
- Programmable shading pipeline
- Vertex buffers
- Uniforms / shader constants
- Shader semantics
- Unity Cg shaders
- Unity shader properties
- #pragma multi_compile
- Debugging shaders using RenderDoc

Matias Lavik

October 19, 2018


Transcript

  1. The good old days
     - Send vertex data to the GPU, and specify settings such as colour and fog
     - Many limitations - less low-level control
     - Some APIs had “high level” concepts, such as sprites
     (PlayStation 1 “PSYQ” SDK; OpenGL fixed-function pipeline)
  2. Programmable shading pipeline
     - More control through the use of shaders
     - Vertex shader: modify vertex positions
     - Fragment shader: modify output colour
     - Newer features: tessellation shaders and geometry shaders
     - Allows you to add screen-space effects by first rendering the scene to a texture
     (OpenGL 2.0 shading pipeline)
  3. Vertex
     - Wikipedia: “a point where two or more curves, lines, or edges meet”
     - Usually: a point in a triangle, with attributes such as:
       • Position
       • Normal
       • Texture coordinate
       • Tangent / bitangent
  4. Vertex buffer
     - Buffered vertex data on the GPU
     - Vertex data is created on the CPU and then uploaded to the video device
     Vertex layout
     - Order of the vertex attributes/components (position, normal, texcoord)
     - Each attribute can have its own buffer (slow), or all attributes can share one buffer
     - One buffer per attribute: (VVVV) (NNNN) (CCCC)
     - Blocks in a batch: (VVVVNNNNCCCC)
     - Interleaved: (VNCVNCVNCVNC) (“stride” = byte offset between attributes)
  5. Uniforms / shader constants
     - OpenGL: “uniform” ≈ DirectX: “shader constant”
     - Per-material data sent to shaders
     - Vertex data is per vertex - uniforms are per material
     - Examples: material properties (colour, smoothness, specular reflectiveness), light sources, cross-section plane
     - In Unity these are called “properties”, and you can set their values using Material.SetFloat(...), Material.SetInt(...), etc.
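A minimal sketch of how a property/uniform pair might look in a Unity shader (the shader name and the `_Tint`/`_Brightness` property names are made up for illustration):

```shaderlab
Shader "Custom/TintExample"
{
    Properties
    {
        // Exposed in the material inspector; settable from C# with
        // material.SetColor("_Tint", ...) / material.SetFloat("_Brightness", ...)
        _Tint ("Tint Colour", Color) = (1, 1, 1, 1)
        _Brightness ("Brightness", Float) = 1.0
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            // Uniforms: matched by name to the properties above
            fixed4 _Tint;
            float _Brightness;

            float4 vert (float4 vertex : POSITION) : SV_POSITION
            {
                return UnityObjectToClipPos(vertex);
            }

            fixed4 frag () : SV_Target
            {
                // Same value for every fragment drawn with this material
                return _Tint * _Brightness;
            }
            ENDCG
        }
    }
}
```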
  6. Rendering
     1. Create vertex data (array of vertices)
     2. Create vertex buffer (send vertices to the GPU)
     3. Bind the vertex buffer and index buffer, and draw
     From Ming3D: https://github.com/mlavik1/Ming3D
  7. Problems
     - Many rendering APIs: OpenGL, DirectX, Vulkan, GNM (PS4), Metal
     - Each rendering API has its own shader language - GLSL (OpenGL), HLSL (DirectX)
     - Need to support several rendering APIs and shader languages - and in some cases several versions of them
     Solution: make your own shader language and convert it to GLSL, HLSL, etc.
     Unity has its own shader language (based on Nvidia’s Cg)
  8. Anatomy of a shader file
     - Name of the shader (and its path)
     - Properties (textures and uniforms / shader constants)
     - The example contains a texture with the name “MainTex”
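A sketch of what such a shader file might look like (the name "Custom/MyShader" is illustrative), declaring and sampling a texture property named "_MainTex":

```shaderlab
Shader "Custom/MyShader"   // name (and menu path) of the shader
{
    Properties
    {
        // Texture shown as "Main Texture" in the material inspector
        _MainTex ("Main Texture", 2D) = "white" {}
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;  // uniform matching the property above

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv  : TEXCOORD0;
            };

            v2f vert (float4 vertex : POSITION, float2 uv : TEXCOORD0)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(vertex);
                o.uv = uv;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}
```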
  9. Various features
     - Math functions: sin(x), cos(x), tan(x)
     - Standard library functions:
       - lerp(a, b, t)
       - smoothstep(a, b, t)
       - clamp(x, a, b)
       - length(v)
       - tex2D(texture, texCoord)
       - http://developer.download.nvidia.com/CgTutorial/cg_tutorial_appendix_e.html
     - Built-in shader variables:
       - _Time: time since level load (t/20, t, t*2, t*3)
       - https://docs.unity3d.com/Manual/SL-ShaderPrograms.html
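A fragment-shader snippet combining a few of these; it assumes a `v2f` input struct and two made-up colour uniforms, `_ColorA` and `_ColorB`:

```hlsl
fixed4 _ColorA; // hypothetical uniforms, for illustration
fixed4 _ColorB;

fixed4 frag (v2f i) : SV_Target
{
    // _Time.y = seconds since level load; sin() gives a value in [-1, 1]
    float t = sin(_Time.y) * 0.5 + 0.5;   // remap to [0, 1]
    t = smoothstep(0.0, 1.0, t);          // ease in/out at the endpoints
    return lerp(_ColorA, _ColorB, t);     // blend between the two colours
}
```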
  10. Shader semantics
     - MSDN: “A semantic is a string attached to a shader input or output that conveys information about the intended use of a parameter”
     - Unity needs to know which attributes in the vertex layout are position, normal, etc. (so it can buffer your mesh correctly)
     - Some rendering APIs require semantics on all input/output data
     - Vertex input/output: POSITION, TEXCOORD0, TEXCOORD1, NORMAL, COLOR, TANGENT
     - Fragment shader output: SV_Target
     - Multiple render targets: SV_Target0, SV_Target1, ...
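A sketch of typical Cg input/output structs with semantics attached (struct and field names are conventions, not requirements):

```hlsl
// Vertex input: semantics tell the runtime which mesh attributes to bind
struct appdata
{
    float4 vertex : POSITION;
    float3 normal : NORMAL;
    float2 uv     : TEXCOORD0;
};

// Vertex output / fragment input
struct v2f
{
    float4 pos : SV_POSITION; // clip-space position
    float2 uv  : TEXCOORD0;
};

// SV_Target routes the fragment output to the render target
fixed4 frag (v2f i) : SV_Target
{
    return fixed4(i.uv, 0, 1); // visualise UVs as colour
}
```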
  11. Shader properties
     - Syntax: _PropertyName(“visual name”, type) = value
     - Types: “Int”, “Vector”, “Color”, “2D” (texture)
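A Properties block sketch showing that syntax for each of the listed types (property names are made up):

```shaderlab
Properties
{
    // _PropertyName ("visual name", type) = default value
    _MyInt    ("Some Int",     Int)    = 1
    _MyVector ("Some Vector",  Vector) = (1, 0, 0, 0)
    _MyColor  ("Some Colour",  Color)  = (1, 1, 1, 1)
    _MyTex    ("Some Texture", 2D)     = "white" {}
}
```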
  12. Including
     - You can split a shader into several files by putting common functions in a .cginc file
     - Unity’s standard shader functions are in:
       - UnityStandardCore.cginc
       - UnityStandardCoreForward.cginc
       - UnityStandardShadow.cginc
       - UnityStandardMeta.cginc
     - Unity shader includes location: Program Files\Unity\Editor\Data\CGIncludes
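A sketch of a shared include file (the file name, guard macro and helper function are hypothetical); any shader in the project could then pull it in with `#include "MyCommon.cginc"` inside its CGPROGRAM block:

```hlsl
// MyCommon.cginc - shared helper functions
#ifndef MY_COMMON_INCLUDED
#define MY_COMMON_INCLUDED   // include guard, so the file is only processed once

float3 DesaturateColour(float3 colour, float amount)
{
    // Luma weights for perceived brightness
    float grey = dot(colour, float3(0.299, 0.587, 0.114));
    return lerp(colour, grey.xxx, amount);
}

#endif
```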
  13. Multi-compile shader program variants
     - If you want to enable/disable a set of features in a shader, without passing a boolean uniform and checking its value, you can use multi-compile program variants:
       1. Add this after CGPROGRAM: #pragma multi_compile __ YOUR_DEFINE
       2. Use #if YOUR_DEFINE to conditionally enable/disable the feature
       3. Enable the feature with: material.EnableKeyword("YOUR_DEFINE");
       4. Disable the feature with: material.DisableKeyword("YOUR_DEFINE");
     - This will create two versions of the shader: one where the “YOUR_DEFINE” preprocessor definition is defined, and one where it is not
     - The #if check is done at compile time (or when the shader is converted)
     - Use shader_feature instead for definitions that will only be set on the material
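The steps above might look like this inside a shader (the keyword RED_TINT_ON is invented for the example, and a `_MainTex` uniform plus `v2f` struct are assumed); at runtime you would toggle it with `material.EnableKeyword("RED_TINT_ON")`:

```hlsl
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
// Two variants get compiled: one with RED_TINT_ON defined, one without ("__")
#pragma multi_compile __ RED_TINT_ON

fixed4 frag (v2f i) : SV_Target
{
    fixed4 col = tex2D(_MainTex, i.uv);
#if RED_TINT_ON
    col.gb = 0; // this branch only exists in the RED_TINT_ON variant
#endif
    return col;
}
ENDCG
```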
  14. Debugging
     - Unity has RenderDoc integration
     - RenderDoc allows you to capture a frame, see all render API calls, visualise the input/output of each shader pass, visualise textures, inspect material properties (uniforms / shader constants) and much more
     - See: https://docs.unity3d.com/Manual/RenderDocIntegration.html
     - Alternatively, use the Visual Studio shader debugger, which allows you to add breakpoints, step through code and more: https://docs.unity3d.com/Manual/SL-DebuggingD3D11ShadersWithVS.html
  15. Using RenderDoc
     1. Download RenderDoc: https://renderdoc.org/builds
     2. Include #pragma enable_d3d11_debug_symbols in your shader’s CGPROGRAM block, if you want to see property names and more
     3. Right-click on the “Game” tab and load RenderDoc
     4. While in-game, capture a frame