課題 (Challenges)

• Time-series dynamics other than a random walk are not captured
  > In practice, word distributions can also change abruptly, e.g. in response to news
• The temporal evolution of topic proportions is not modeled
  > In practice, topic proportions change over time as well

Generative process:

1. Draw topics: \beta_t \mid \beta_{t-1} \sim \mathcal{N}(\beta_{t-1}, \sigma^2 I)
2. For each document:
   a. Draw topic proportions: \theta_d \sim \mathrm{Dir}(\alpha)
   b. For each word:
      i. Draw a topic assignment: z \sim \mathrm{Mult}(\theta_d)
      ii. Draw the word: w_{t,d,n} \sim \mathrm{Mult}(\mathrm{softmax}(\beta_{t,z}))
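The generative process above can be simulated directly. Below is a minimal sketch using numpy; the sizes (V, K, T), the noise scale sigma, and all variable names are illustrative choices, not part of the original model specification.

```python
import numpy as np

rng = np.random.default_rng(0)

V, K, T = 50, 3, 4          # vocabulary size, number of topics, time steps (toy values)
sigma, alpha = 0.1, 0.5     # random-walk noise scale, Dirichlet concentration
docs_per_t, words_per_doc = 2, 20

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# 1. Topics evolve as a Gaussian random walk: beta_t | beta_{t-1} ~ N(beta_{t-1}, sigma^2 I)
beta = np.zeros((T, K, V))
beta[0] = rng.normal(0.0, 1.0, size=(K, V))
for t in range(1, T):
    beta[t] = beta[t - 1] + rng.normal(0.0, sigma, size=(K, V))

corpus = []
for t in range(T):
    for d in range(docs_per_t):
        # 2a. Topic proportions: theta_d ~ Dir(alpha)
        theta = rng.dirichlet(alpha * np.ones(K))
        words = []
        for n in range(words_per_doc):
            # 2b-i. Topic assignment: z ~ Mult(theta_d)
            z = rng.choice(K, p=theta)
            # 2b-ii. Word: w_{t,d,n} ~ Mult(softmax(beta_{t,z}))
            words.append(rng.choice(V, p=softmax(beta[t, z])))
        corpus.append((t, words))

print(len(corpus))  # prints 8 (T * docs_per_t documents)
```

Note how the two limitations listed above appear concretely here: the only temporal dynamic is the Gaussian random walk on beta, and theta is drawn independently per document with no time dependence.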
• David M. Blei and John D. Lafferty (2006). Dynamic Topic Models. In International Conference on Machine Learning.
• Patrick Jähnichen, Florian Wenzel, Marius Kloft, and Stephan Mandt (2018). Scalable Generalized Dynamic Topic Models. In International Conference on Artificial Intelligence and Statistics.
• Adji B. Dieng, Francisco J. R. Ruiz, and David M. Blei (2019). The Dynamic Embedded Topic Model. arXiv preprint.
• Jey Han Lau, David Newman, and Timothy Baldwin (2014). Machine Reading Tea Leaves: Automatically Evaluating Topic Coherence and Topic Model Quality. In Conference of the European Chapter of the Association for Computational Linguistics.
• Adji B. Dieng, Francisco J. R. Ruiz, and David M. Blei (2020). Topic Modeling in Embedding Spaces. In Transactions of the Association for Computational Linguistics.
• Hanna M. Wallach, David M. Mimno, and Andrew McCallum (2009). Rethinking LDA: Why Priors Matter. In Neural Information Processing Systems.