Starting fall 2019 I will be a postdoc at the University of Washington in the lab of Nick Steinmetz.
I am currently a PhD student at Stanford University in the lab of Justin Gardner studying how perception and action are linked to sensory representations in the brain.
In my research I build linking models that connect physiological measurements to perceptual decisions. I use these models to test different theories of how readout occurs: for example, when you go from walking out your front door to driving your car, does your brain change its sensory representations to let you behave flexibly? Or does it leave the sensory system alone and make those changes somewhere else?
As a graduate student working with Justin Gardner, one of my major projects has been investigating how we can use motion visibility to understand how the brain "reads out" sensory representations during perceptual decision making.
The first paper focuses on how changes in sensory representation are insufficient to explain changes in human behavior during a "feature-based attention" task.
The second paper is a framework for modeling how visual cortex responds to motion visibility.
In 2013 I lived in Berlin and worked in the lab of John-Dylan Haynes on this project.
We have an intuition that we "commit" to a decision at a specific moment. Despite this intuition, early neuroscience research found that brain activity becomes predictive of our intentions far in advance, sometimes up to 10 seconds. In this experiment we showed that the point of no return, after which an action is guaranteed to happen, actually occurs only about 200 ms before motor activity. Until that point, the brain has not fully committed and the action can still be cancelled.
Birman, D.*, Schultze-Kraft, M.*, Rusconi, M., Allefeld, C., Görgen, K., Dähne, S., ... & Haynes, J. D. (2015). The point of no return in vetoing self-initiated movements. Proceedings of the National Academy of Sciences, 201513569. *Equal author contribution. At Haynes Lab.
I am very interested in building better models that link human and non-human animal behavior as a way of connecting physiological data across model systems. We became interested in this project while examining a set of monkey physiology data in which the order of task training turned out to be critical to interpreting the results.
Birman, D., & Gardner, J. L. (2016). Parietal and prefrontal: categorical differences? Nature Neuroscience, 19(1), 5-7. At Gardner Lab.
Let's face it. We're awful at teaching cognitive neuroscience! When you learn about physics you get to play with kinetics, electricity, magnetism, and thermodynamics. When you learn about chemistry you get to blow things up! But for some reason we seem to think that students will get excited if we show them pictures and movies about psychology and call it a day.
In my time at Stanford I've put a lot of work into building playful brain simulation tutorials about cognitive neuroscience. These tutorials help students re-discover classic experiments on the visual system. Brain (below) is the new interface to all of these -- but I keep a list of older tutorials here.
Brain is a playful learning environment I have been working on. A demo version is available:
Instructors: Dan Birman, Corey Fernandez
Instructor: Justin Gardner
Head TA: Dan Birman
2019 TAs: Akshay Jagadeesh, Minyoung Lee, Jon Walters, Ian Eisenberg, Josiah Leong, Kawena Hirayama; Undergraduate TAs: Megumi Sano, Graham Todd, Greg Weaver, Vinh Ton, Kendall Costello, Michael Ko
2018 TAs: Akshay Jagadeesh, Minyoung Lee, Guillaume Riesen, Jon Walters; Undergraduate TAs: Emma Master, Stephanie Zhang, Kawena Hirayama, Henry Ingram, Storm Foley
2017 TAs: Minyoung Lee, Lior Bugatus, Zeynep Enkavi, Mona Rosenke, Guillaume Riesen
2016 TAs: Anna Khazenzon, Natalia Velez, Anthony Stigliani, Rosemary Le
Instructors: Russ Poldrack, Justin Gardner
TA: Dan Birman
Instructors: Ewart Thomas, Benoit Monin
TAs: Dan Birman, Stephanie Gagnon, Robert Hawkins