Learning,” J. Société Mathématiques Fr., 2019.
- Y. Jiang, B. Neyshabur, D. Krishnan, H. Mobahi, and S. Bengio, “Fantastic Generalization Measures and Where to Find Them,” in ICLR, 2020.
- S. M. Kakade, K. Sridharan, and A. Tewari, “On the Complexity of Linear Prediction: Risk Bounds, Margin Bounds, and Regularization,” in NeurIPS, 2008.
- A. Majumdar and M. Goldstein, “PAC-Bayes Control: Synthesizing Controllers that Provably Generalize to Novel Environments,” in CoRL, 2018.
- D. A. McAllester, “Some PAC-Bayesian Theorems,” in COLT, 1998, pp. 230–234.
- D. McNamara and M.-F. Balcan, “Risk Bounds for Transferring Representations With and Without Fine-Tuning,” in ICML, 2017, pp. 2373–2381.
- T. Mikolov, I. Sutskever, K. Chen, G. Corrado, and J. Dean, “Distributed Representations of Words and Phrases and their Compositionality,” in NeurIPS, 2013, pp. 3111–3119.
- M. Mohri, A. Rostamizadeh, and A. Talwalkar, Foundations of Machine Learning (2nd ed.), The MIT Press, 2018.
- B. Neyshabur, S. Bhojanapalli, and N. Srebro, “A PAC-Bayesian Approach to Spectrally-Normalized Margin Bounds for Neural Networks,” in ICLR, 2018.
- D. Reeb, A. Doerr, S. Gerwinn, and B. Rakitsch, “Learning Gaussian Processes by Minimizing PAC-Bayesian Generalization Bounds,” in NeurIPS, 2018.
- M. Seeger, “PAC-Bayesian Generalisation Error Bounds for Gaussian Process Classification,” J. Mach. Learn. Res., vol. 3, pp. 233–269, 2002.
- T. Suzuki, “PAC-Bayesian Bound for Gaussian Process Regression and Multiple Kernel Additive Model,” in COLT, 2012.
- Y. Tsuzuku, I. Sato, and M. Sugiyama, “Normalized Flat Minima: Exploring Scale Invariant Definition of Flat Minima for Neural Networks Using PAC-Bayesian Analysis,” in ICML, 2020.
- J. Yang, S. Sun, and D. M. Roy, “Fast-rate PAC-Bayes Generalization Bounds via Shifted Rademacher Processes,” in NeurIPS, 2019.