Nearest neighbor search appears across many fields of CS: pattern recognition, statistical classification, computer vision, computational geometry, databases, etc. A practical example is similar-picture searching; theoretically, it is a representative high-dimensional problem.
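To fix ideas, here is a minimal sketch (not from the paper) of the exact baseline that the rest of the talk tries to beat: a linear scan over the database, costing O(n·d) per query, which is what makes high-dimensional NNS expensive.

```python
import math

def nearest_neighbor(database, query):
    """Exact nearest neighbor by linear scan: O(n * d) per query."""
    best, best_dist = None, math.inf
    for point in database:
        dist = math.dist(point, query)  # Euclidean distance in d dimensions
        if dist < best_dist:
            best, best_dist = point, dist
    return best

# Toy 3-dimensional "image feature" vectors; real image search uses
# hundreds or thousands of dimensions, which is where the scan hurts.
db = [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0), (0.9, 0.2, 0.1)]
print(nearest_neighbor(db, (1.0, 0.0, 0.0)))  # → (0.9, 0.2, 0.1)
```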
Randomized Approximate Nearest Neighbor Search with Limited Adaptivity. In Proceedings of the 28th ACM Symposium on Parallelism in Algorithms and Architectures (SPAA '16), 2016. Best Paper Finalist of SPAA '16.
Cell-probe model: the data structure occupies s memory units (cells), each of w bits; answering a query requires reading t memory units in total, which models the time cost in the real world. In the basic model, cells are probed one by one. In the parallelized cell-probe model, the algorithm may access multiple units at once, in k rounds of adaptivity.
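The model above can be made concrete with a small counting harness (an illustration I wrote for this summary, not code from the paper): memory is an array of s cells of w bits, each round reads a batch of cells non-adaptively, and adaptivity means a later round's addresses may depend on earlier rounds' results.

```python
class CellProbeMemory:
    """Toy cell-probe memory: s cells of w bits; counts probes and rounds."""
    def __init__(self, cells, w):
        self.cells = cells        # the s memory cells
        self.w = w                # bits per cell (unused here, kept for the model)
        self.total_probes = 0     # t: total cells read
        self.rounds = 0           # number of rounds of adaptivity used

    def probe_round(self, addresses):
        """One round: read a batch of cells at once (non-adaptively)."""
        self.rounds += 1
        self.total_probes += len(addresses)
        return [self.cells[a] for a in addresses]

# Two adaptive rounds: the second round's address depends on a value
# returned by the first round -- that dependence is what adaptivity means.
mem = CellProbeMemory(cells=[3, 1, 4, 1, 5, 9], w=4)
first = mem.probe_round([0, 2])        # read cells 0 and 2 in parallel
second = mem.probe_round([first[0]])   # address chosen from round-1 results
print(mem.rounds, mem.total_probes)    # 2 rounds, t = 3 probes in total
```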
The problem stays hard if we have to find the exact nearest neighbor, or if we have to find the answer deterministically: the so-called curse of dimensionality. Toward removing the curse, we deal with NNS approximately and with randomization.
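One classic way to trade exactness and determinism for speed is locality-sensitive hashing; the random-hyperplane sketch below is a standard illustration of that idea (my example, not the paper's algorithm): close points land in the same bucket with high probability, so a query scans only its own bucket.

```python
import math
import random

def random_hyperplanes(d, m, rng):
    """Draw m random hyperplane normals in R^d (Gaussian entries)."""
    return [[rng.gauss(0, 1) for _ in range(d)] for _ in range(m)]

def lsh_key(point, planes):
    """Sign pattern of the point against each hyperplane: an m-bit hash."""
    return tuple(1 if sum(p * q for p, q in zip(plane, point)) >= 0 else 0
                 for plane in planes)

rng = random.Random(0)
planes = random_hyperplanes(d=3, m=4, rng=rng)

# Build: bucket every database point by its hash.
db = [(0.0, 0.0, 1.0), (1.0, 1.0, 1.0), (0.9, 0.2, 0.1)]
buckets = {}
for p in db:
    buckets.setdefault(lsh_key(p, planes), []).append(p)

# Query: scan only the query's bucket (may miss -- that is the
# "approximate and randomized" trade-off).
query = (1.0, 0.0, 0.0)
candidates = buckets.get(lsh_key(query, planes), [])
best = min(candidates, key=lambda p: math.dist(p, query), default=None)
```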
Main results, for approximate randomized NNS with k rounds of adaptivity:

If s = n^O(1) and w = d^O(1):
- Upper bound: t = O(k (log d)^(1/k))
- Lower bound: t = Ω((1/k) (log d)^(1/k))
- Tight if k is constant or t = k; asymptotically tight in the other cases. With unbounded adaptivity this matches the known tight bound t = Θ(log log d / log log log d).

If s = n^O(1) and w = O(d):
- t = O(k + ((1/k) log d)^O(1/k))
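The k·(log d)^(1/k) upper bound can be tabulated numerically to show the trade-off the talk emphasizes: a few rounds of adaptivity already cut t far below the fully non-adaptive (k = 1) cost. The snippet below is only an illustration; it fixes log to base 2 and ignores the constants hidden in the O(·).

```python
import math

def upper_bound(k, d):
    """Evaluate k * (log d)^(1/k), the shape of the upper bound t."""
    return k * math.log2(d) ** (1 / k)

d = 2 ** 20  # a sample dimension, so log2(d) = 20
for k in (1, 2, 3, 5):
    # k = 1 (non-adaptive) pays the full log d; k = 2 or 3 already
    # flattens the curve dramatically.
    print(k, round(upper_bound(k, d), 2))
```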
The best known lower bound for λNN was given in 2006.
- We are trying to prove a better lower bound.
- λNN is a decision problem.
- Any better result would therefore be the highest known lower bound among all decision problems.