Slide 12
Computations in Instant Neural Graphics Primitives
Resolution of each level:

N_l := floor(N_min · b^l)
b := exp( (ln N_max - ln N_min) / (L - 1) )
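As a sketch, the per-level resolutions follow directly from these two definitions. The values below assume the typical hyperparameters from Table 1 (L = 16, N_min = 16, N_max = 512):

```python
import math

# Per-level grid resolutions of the hash encoding, assuming the typical
# values L = 16, N_min = 16, N_max = 512 from Table 1.
L, N_min, N_max = 16, 16, 512

# Growth factor between consecutive levels:
# b = exp((ln N_max - ln N_min) / (L - 1)).
b = math.exp((math.log(N_max) - math.log(N_min)) / (L - 1))

# N_l = floor(N_min * b**l) for l = 0 .. L-1.
resolutions = [math.floor(N_min * b ** l) for l in range(L)]
```

For these values b ≈ 1.26, so the resolutions grow geometrically from 16 at the coarsest level toward 512 at the finest.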
Definition of the hash function:

h(x) = ( ⊕_{i=1}^{d} x_i · π_i ) mod T

where ⊕ is bitwise XOR and the π_i are sufficiently large prime numbers, except for π_1 = 1.
No measures are taken against hash collisions.
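A minimal sketch of this hash in Python. The prime values π_2 and π_3 are the ones used in the Instant NGP paper; the table size T in the example is an assumed value from the typical range in Table 1:

```python
# Spatial hash h(x) = (XOR_i x_i * pi_i) mod T, with pi_1 = 1 and the
# remaining pi_i large primes. No collision handling is performed.
PRIMES = (1, 2_654_435_761, 805_459_861)  # pi_1, pi_2, pi_3

def spatial_hash(coords, T):
    """Map integer grid coordinates (up to 3D here) to a slot in [0, T)."""
    h = 0
    for x_i, pi_i in zip(coords, PRIMES):
        h ^= x_i * pi_i  # bitwise XOR of the scaled coordinates
    return h % T

# Example: hash a 3D grid corner into a table of T = 2**14 entries.
idx = spatial_hash((10, 20, 30), T=2 ** 14)
```

Distinct corners may map to the same slot; as the slide notes, no explicit collision resolution is performed and the network is left to cope with collisions during training.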
Fig. 3. Illustration of the multiresolution hash encoding in 2D. (1) For a given input coordinate, find the surrounding grid cells at each resolution level and assign indices to their corners by hashing their integer coordinates. (2) For all resulting corner indices, look up the corresponding feature vectors from the hash tables θ, and (3) linearly interpolate them according to the relative position of the input within its cell. (4) Concatenate the result of each level, as well as auxiliary inputs ξ ∈ R^E, producing the encoded MLP input. Gradients are backpropagated through the MLP (5), the concatenation (4), and the linear interpolation (3) into the hash table entries.
Table 1. Hash encoding parameters and their typical values. Only the hash table size T and max. resolution N_max need to be tuned to the use case.

Parameter                                   Symbol   Value
Number of levels                            L        16
Max. entries per level (hash table size)    T        [2^14, 2^24]
Number of feature dimensions per entry      F        2
Coarsest resolution                         N_min    16
Finest resolution                           N_max    [512, 524288]
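To get a feel for these values: the total number of trainable encoding parameters is at most L · T · F. A quick sketch, assuming a mid-range table size T = 2^19 and half-precision (2-byte) entries as an illustrative storage cost:

```python
# Upper bound on hash-encoding parameters, assuming T = 2**19 (mid-range
# of Table 1) and fp16 storage (2 bytes per entry) for illustration.
L, T, F = 16, 2 ** 19, 2
num_params = L * T * F                # feature-table entries, all levels
size_mib = num_params * 2 / 2 ** 20   # fp16: two bytes per entry
```

That is about 16.8 M parameters (~32 MiB), independent of the finest resolution N_max, since levels finer than the table size simply reuse slots via the hash.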
3 MULTIRESOLUTION HASH ENCODING
Given a fully connected neural network m(y; Φ), we are interested in
an encoding of its inputs y = enc(x; θ) that improves the approxima-
tion quality and training speed across a wide range of applications
without incurring a notable performance overhead. Our neural net-
work not only has trainable weight parameters Φ, but also trainable
encoding parameters θ. These are arranged into L levels, each con-
taining up to T feature vectors with dimensionality F. Typical values
for these hyperparameters are shown in Table 1.
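Putting the pieces together, one level of the encoding can be sketched in 2D as follows. All values are assumed toy values; the table would be trainable in the real method (here it is just randomly initialized), and the full encoding concatenates the outputs of L such levels:

```python
import numpy as np

T, F, N_l = 2 ** 14, 2, 64        # table size, feature dim, level resolution
PRIMES = (1, 2_654_435_761)       # pi_1 = 1, pi_2 from the paper
rng = np.random.default_rng(0)
table = rng.uniform(-1e-4, 1e-4, size=(T, F))  # one level's feature table

def encode_level(x, y):
    """Bilinearly interpolate hashed corner features for (x, y) in [0, 1)^2."""
    gx, gy = x * N_l, y * N_l
    x0, y0 = int(gx), int(gy)          # cell corner
    fx, fy = gx - x0, gy - y0          # relative position inside the cell
    out = np.zeros(F)
    for dx in (0, 1):
        for dy in (0, 1):
            # Hash the integer corner coordinates into the table.
            idx = ((x0 + dx) * PRIMES[0] ^ (y0 + dy) * PRIMES[1]) % T
            # Bilinear interpolation weight for this corner.
            w = (fx if dx else 1 - fx) * (fy if dy else 1 - fy)
            out += w * table[idx]
    return out

feat = encode_level(0.3, 0.7)  # one F-dimensional feature vector
```

Because the interpolation weights sum to one, the output is a convex combination of the four looked-up feature vectors, and gradients flow back into exactly those table entries during training.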