
I've spent the last 70 weeks sampling different research areas in cognitive science, computer science, and statistics. In 4- to 8-week increments, I would investigate a particular paper that I knew nothing about. I would read about the authors and about related papers, and try to piece together how the paper fits into a local neighborhood of papers. I would form expectations about what I should learn next, memorize the ideas I thought would empower my search, and then move on to another paper and repeat. To have something to show for this exploration, I recorded myself reciting the ideas from memory on video. The videos are below, organized chronologically, with the most recent topics at the top of the page.

I arbitrarily decided to let the memory palace provide my initial orientation. If nothing fruitful ever comes from thinking about memory palaces that contain abstract concepts, then at least I've ruminated on ideas from various areas of science, I've had a lot of fun, and my research interests now stem from my own curiosities. How do we explore when we are a four-year-old, or a naive 24-year-old, and do not know what we do not know? How do humans bootstrap their knowledge into a new area of study? What phenomena accompany ruminating on sets of ideas? How, and what, does the brain implicitly decide to memorize? What is the fabric of my world model, and how do I perturb it in order to see more of the conceptual landscape?

  • 70 ideas from “Meta-Learned Models of Cognition” (2023) by Marcel Binz, Ishita Dasgupta, Akshay Jagadish, Matt Botvinick, Jane Wang, and Eric Schulz

  • 99 ideas from “Learning Outside the Brain: Integrating Cognitive Science and Systems Biology” (2022) by Jeremy Gunawardena

  • 41 ideas from “Play, Curiosity, and Cognition” (2022) by Junyi Chu and Laura E. Schulz

  • 193 ideas from “Amortized Variational Inference: Towards a Mathematical Foundation and Review” (2022) by Ankush Ganguly, Sanjana Jain, Ukrit Watchareeruetai

  • 121 ideas from “Abstraction and Analogy-Making in Artificial Intelligence” (2021) by Melanie Mitchell

  • 65 ideas from “Scene Representation Networks: Continuous 3D-Structure-Aware Neural Scene Representations” (2019) by Vincent Sitzmann, Michael Zollhöfer, Gordon Wetzstein

  • 41 ideas from “Deconstructing Episodic Memory with Construction” (2007) by Demis Hassabis and Eleanor Maguire

  • 122 ideas from “State of the Art on Neural Rendering” (2020) by Ayush Tewari et al.

  • 60 ideas from “Compositional Inductive Biases in Function Learning” (2017) by Eric Schulz, Joshua Tenenbaum, David Duvenaud, Maarten Speekenbrink, Samuel Gershman

  • 60 ideas from “Memory as a Computational Resource” (2020) by Samuel Gershman and Ishita Dasgupta

  • All definitions and results from Chapter 7 of “Abstract Algebra” by David Dummit and Richard Foote
