Slide 32
References
[Neklyudov ’23] Neklyudov, K., Brekelmans, R., Severo, D., & Makhzani, A. (2023). Action Matching: Learning Stochastic Dynamics from Samples.
[Chen ’21] Chen, T., Liu, G. H., & Theodorou, E. (2021, October). Likelihood Training of Schrödinger Bridge using Forward-Backward SDEs Theory. In International Conference on Learning Representations.
[Liu ’22] Liu, G. H., Chen, T., So, O., & Theodorou, E. (2022). Deep generalized Schrödinger bridge. Advances in Neural Information Processing Systems, 35, 9374-9388.
[Wang ’21] Wang, G., Jiao, Y., Xu, Q., Wang, Y., & Yang, C. (2021, July). Deep generative learning via Schrödinger bridge. In International Conference on Machine Learning (pp. 10794-10804). PMLR.
[Zhang ’22] Zhang, Q., & Chen, Y. (2022). Path Integral Sampler: A Stochastic Control Approach for Sampling. Proceedings of Machine Learning Research.
[Gushchin ’22] Gushchin, N., Kolesov, A., Korotin, A., Vetrov, D., & Burnaev, E. (2022). Entropic neural optimal transport via diffusion processes. arXiv preprint arXiv:2211.01156.
[Somnath ’23] Somnath, V. R., Pariset, M., Hsieh, Y. P., Martinez, M. R., Krause, A., & Bunne, C. (2023). Aligned Diffusion Schrödinger Bridges. arXiv preprint arXiv:2302.11419.
[Tong ’23] Tong, A., Malkin, N., Huguet, G., Zhang, Y., Rector-Brooks, J., Fatras, K., ... & Bengio, Y. (2023, July). Improving and generalizing flow-based generative models with minibatch optimal transport. In ICML Workshop on New Frontiers in Learning, Control, and Dynamical Systems.
[Tong ’23] Tong, A., Malkin, N., Fatras, K., Atanackovic, L., Zhang, Y., Huguet, G., ... & Bengio, Y. (2023). Simulation-free Schrödinger bridges via score and flow matching. arXiv preprint arXiv:2307.03672.