I am a Washington Research Foundation Postdoctoral Fellow in the lab of Nick Steinmetz at the University of Washington. Recently I have been spending a lot of time skiing volcanoes!
I am now leading the Virtual Brain Lab, a project to create intuitive and interactive 3D visualizations of neuroscience data.
In the past I worked on building linking models to connect physiological measurements to perceptual decisions.
© Dan Birman 2015-Present . Code
As a graduate student working with Justin Gardner, one of my major projects has been to use motion visibility to understand how the brain "reads out" from sensory representations during perceptual decision making.
The first paper focuses on how changes in sensory representation are insufficient to explain changes in human behavior during a "feature-based attention" task.
The second paper is a framework for modeling how visual cortex responds to motion visibility.
with Justin Gardner at Gardner Lab; Code; Stim; Data
In 2013 I lived in Berlin and worked in the lab of John-Dylan Haynes on this project.
We have an intuition that we "commit" to a decision at a specific moment. Despite this intuition, early neuroscience researchers found that brain activity becomes predictive of our intentions far in advance, sometimes up to 10 seconds. In this experiment we showed that in reality the point of no return, after which an action is guaranteed to happen, occurs only about 200 ms before motor activity. Until that point the brain has not fully committed, and the movement can still be cancelled.
Birman, D.*, Schultze-Kraft, M.*, Rusconi, M., Allefeld, C., Görgen, K., Dähne, S., ... & Haynes, J. D. (2015). The point of no return in vetoing self-initiated movements. Proceedings of the National Academy of Sciences, 201513569. *Equal author contribution. at hayneslab
Understanding human physiology using model systems is my main research goal. Convolutional neural networks have proven to be a particularly good model of the primate visual system. I find this intriguing because it allows us to run "impossible" experiments in a CNN, probing theories that are untestable in the human brain. In this project we do exactly that, and find that the classic idea that spatial attention works by pulling together receptive fields at the locus of attention is implausible.
Fox, K. J.*, Birman, D.*, & Gardner, J. L. (2022). Behavioral benefits of spatial attention explained by multiplicative gain, not receptive field shifts, in a neural network model. Preprint. at Gardner Lab.
I am very interested in building better models that link human and non-human animal behavior, as a way of connecting physiological data across model systems. We got interested in this project while looking at a set of monkey physiology data where the order of task training turned out to be critical to interpreting the results.
Birman, D., & Gardner, J. L. (2016). Parietal and prefrontal: categorical differences? Nature Neuroscience, 19(1), 5-7. at Gardner Lab.
Brain is a playful learning environment I have been developing. A demo version is available:
Instructors: Dan Birman, Corey Fernandez
Instructor: Justin Gardner
Head TA: Dan Birman
2019 TAs: Akshay Jagadeesh, Minyoung Lee, Jon Walters, Ian Eisenberg, Josiah Leong, Kawena Hirayama; Undergraduate TAs: Megumi Sano, Graham Todd, Greg Weaver, Vinh Ton, Kendall Costello, Michael Ko
2018 TAs: Akshay Jagadeesh, Minyoung Lee, Guillaume Riesen, Jon Walters; Undergraduate TAs: Emma Master, Stephanie Zhang, Kawena Hirayama, Henry Ingram, Storm Foley
2017 TAs: Minyoung Lee, Lior Bugatus, Zeynep Enkavi, Mona Rosenke, Guillaume Riesen
2016 TAs: Anna Khazenzon, Natalia Velez, Anthony Stigliani, Rosemary Le
Instructors: Russ Poldrack, Justin Gardner
TA: Dan Birman
Instructors: Ewart Thomas, Benoit Monin
TAs: Dan Birman, Stephanie Gagnon, Robert Hawkins