
June - July 2023

For about eight weeks, I latched onto David Duvenaud's statistical machine learning class materials as well as this review on amortized variational inference (AVI). Fortunately, I had taken two Bayesian statistics classes as electives during my master's. By diving into the details of AVI, I wanted to unlock papers in a variety of applied areas such as those linked here: visual working memory, generative modeling, scene representations, structural biology, and probabilistic reasoning. I am particularly interested in work on compositional learning, as described further in my research statements.


- "Amortized inference uses a stochastic function to estimate the true posterior. The parameters of this stochastic function are fixed and shared across all data points, thereby amortizing the inference."​

- "Generally, the distance between points in the latent space in a variational autoencoder does not reflect the true similarity of corresponding points in the observation space."​​​​​​​​​
