P_L = M_L V  (M_L: the light's view-projection matrix, V: vertex)
P'_x = P_x + GetWarpX(P_x)
P'_y = P_y + GetWarpY(P_y)
P' is in clip space (-1.0 to +1.0), but the GetWarp methods return texture-space values (0.0 to 1.0), so the results of GetWarpX/GetWarpY must be converted to clip space.
※ Figure from Paul Rosen, “Rectilinear texture warping for fast adaptive shadow maps”, i3D 2012.
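The coordinate warp above can be sketched as follows. This is a minimal illustration, assuming the warp maps are sampled by hypothetical `get_warp_x`/`get_warp_y` stubs (standing in for the slide's GetWarpX/GetWarpY), and assuming the texture-to-clip conversion is the usual `v * 2 - 1` remap:

```python
# Sketch of rectilinear texture warping (RTW) coordinate computation.
# get_warp_x / get_warp_y are hypothetical stand-ins for GetWarpX /
# GetWarpY; a real implementation samples 1D warp maps built by RTW.

def tex_to_clip(v):
    """Remap a texture-space value (0..1) to clip space (-1..+1)."""
    return v * 2.0 - 1.0

def get_warp_x(x):
    # Placeholder stub: returns a texture-space value in [0, 1].
    return 0.5

def get_warp_y(y):
    return 0.5

def warp_clip_coords(px, py):
    """Warp a clip-space position as on the slide:
    P'_x = P_x + GetWarpX(P_x), P'_y = P_y + GetWarpY(P_y),
    with each warp value converted from texture space to clip
    space before it is added."""
    wx = tex_to_clip(get_warp_x(px))
    wy = tex_to_clip(get_warp_y(py))
    return px + wx, py + wy
```

With the stubs returning 0.5 (clip-space 0.0 after conversion), the warp is the identity; in practice the warp maps redistribute shadow-map resolution toward important regions.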
In the Analysis case, the scene is analyzed from the camera's viewpoint, so a rendered image already exists. Only the shadow term needs to be computed, and it is composited on top of that image.
• The shadow-map texture coordinates are obtained by warping:
  s' = s + GetWarpS(s)
  t' = t + GetWarpT(t)
※ Figure from Paul Rosen, “Rectilinear texture warping for fast adaptive shadow maps”, i3D 2012.
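The analysis-pass lookup can be sketched like this. The 1D warp maps are assumed here to be simple lists of texture-space offsets (the paper derives them from an importance analysis of the camera view; the nearest-neighbor lookup is a simplification):

```python
# Sketch of the analysis-pass shadow-map lookup: each texture axis
# is warped independently before sampling.
# warp_s / warp_t are hypothetical 1D warp maps stored as lists of
# texture-space offsets.

def sample_warp(warp_map, coord):
    """Nearest-neighbor lookup of a 1D warp map at coord in [0, 1]."""
    i = min(int(coord * len(warp_map)), len(warp_map) - 1)
    return warp_map[i]

def warped_uv(s, t, warp_s, warp_t):
    """s' = s + GetWarpS(s), t' = t + GetWarpT(t)"""
    return s + sample_warp(warp_s, s), t + sample_warp(warp_t, t)
```

An all-zero warp map reproduces the unwarped lookup; nonzero entries shift samples toward the regions the analysis marked as important.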
“Smooth shadow boundaries with exponentially warped Gaussian filtering”, Computers and Graphics, 37(3):214–224, 2013.
In this paper we propose a new statistical filtering method that approximates the cumulative distribution function (CDF) of depth values by a Gaussian CDF instead of bounding it with Chebyshev Inequality. This approximation significantly reduces “light leaks” and has similar performance and storage requirements compared to the original variance shadow map method. We also show that the combination of this technique with an exponential warp allows us to further reduce the remaining shadowing artifacts from the rendered image.
・Apparently uses a cumulative distribution function…
・I haven't read the paper carefully yet.
・I'd like to study it properly and write it up on the blog. (Wishful thinking.)
・Light bleeding is eliminated.
・Slower than EVSM.
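The core idea from the abstract can be sketched as below: classic VSM bounds the visibility P(depth ≥ d) with Chebyshev's inequality, while this paper approximates it with a Gaussian CDF built from the same filtered mean and variance. This is a sketch of the concept, not the paper's exact implementation:

```python
import math

# mu and var come from a filtered (depth, depth^2) shadow map,
# exactly as in variance shadow maps (VSM).

def chebyshev_upper_bound(mu, var, d):
    """Classic VSM visibility: Chebyshev's inequality upper bound
    on P(depth >= d)."""
    if d <= mu:
        return 1.0
    return var / (var + (d - mu) ** 2)

def gaussian_visibility(mu, var, d):
    """Gaussian-CDF visibility: P(depth >= d) under N(mu, var),
    i.e. 1 - Phi((d - mu) / sigma)."""
    sigma = math.sqrt(max(var, 1e-8))
    return 0.5 * math.erfc((d - mu) / (sigma * math.sqrt(2.0)))
```

For a receiver well behind the mean occluder depth, the Gaussian tail falls off much faster than the Chebyshev bound, which is why the light leaks shrink.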
“Virtual Shadow Maps for Many Lights”
http://www.slideshare.net/omochi64/ss-47145338?qid=bd52bda4-2f71-4290-90ae-51d5ce6cab94&v=&b=&from_search=1
※ Figure from Ola Olsson, Erik Sintorn, et al., “Efficient Virtual Shadow Maps for Many Lights”, i3D 2014.
This paper describes an algorithm for rendering soft shadows efficiently by generalizing conventional triangle projection and rasterization from 2D to 4D. The rectangular area light source is modeled with a point light source that translates with two degrees of freedom. This generalizes the projection of triangles and of output image samples, as seen from the light, to the locus of projections as the light translates. The generalized projections are rasterized to determine a conservative set of sample/triangle pairs, which are then examined to derive light occlusion masks for each sample. The algorithm is exact in the sense that each element of the occlusion mask of a sample is computed accurately by considering all potentially blocking triangles. The algorithm does not require any type of precomputation, so it supports fully dynamic scenes. We have tested our algorithm on several scenes to render complex soft shadows accurately at interactive rates.
• “Fast Shadow Map Rendering for Many Lights Settings”, EGSR 2016
In this paper we present a method to efficiently cull large parts of a scene prior to shadow map computations for many-lights settings. Our method is agnostic to how the light sources are generated and thus works with any method of light distribution. Our approach is based on previous work in culling for ray traversal to speed up area light sampling.
• “Filtering Multilayer Shadow Maps for Accurate Soft Shadows”, EG 2016
In this paper, we introduce a novel technique for pre-filtering multi-layer shadow maps. The occluders in the scene are stored as variable-length lists of fragments for each texel. We show how this representation can be filtered by progressively merging these lists. In contrast to previous pre-filtering techniques, our method better captures the distribution of depth values, resulting in a much higher shadow quality for overlapping occluders and occluders with different depths.
The pre-filtered maps are generated and evaluated directly on the GPU, and provide efficient queries for shadow tests with arbitrary filter sizes. Accurate soft shadows are rendered in real-time even for complex scenes and difficult setups. Our results demonstrate that our pre-filtered maps are general and particularly scalable.
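The multilayer pre-filtering idea (variable-length per-texel fragment lists, progressively merged) can be sketched very roughly as follows. This is only an illustration under strong assumptions: the merge here just combines sorted depth lists and caps their length, whereas the paper's merge preserves the depth distribution much more carefully:

```python
# Rough sketch: each shadow-map texel stores a variable-length,
# sorted list of occluder fragment depths. Coarser filter levels are
# built by progressively merging neighboring texels' lists.
# Truncating to the nearest max_len occluders is an assumption made
# for brevity, not the paper's actual merge rule.

def merge_fragment_lists(a, b, max_len=8):
    """Merge two sorted per-texel depth lists into one sorted list,
    truncated to max_len entries (nearest occluders kept)."""
    merged = sorted(a + b)
    return merged[:max_len]

def build_coarser_level(texels, max_len=8):
    """Merge pairs of adjacent texel lists to form the next
    (half-resolution) filter level of a 1D row of texels."""
    return [merge_fragment_lists(texels[i], texels[i + 1], max_len)
            for i in range(0, len(texels) - 1, 2)]
```

Because each level keeps a list rather than a single moment pair, overlapping occluders at different depths stay distinguishable after filtering, which is the quality advantage the abstract claims.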