How can biological creatures maneuver and interact so adeptly in the real world, while the synthetic creatures we build are so clumsy outside of structured environments? One key issue is our basic understanding of the boundaries (if any) between sensorimotor processes and the external world. In my work, I question some of these boundaries. For example, I have found that fixating (looking at one object while moving) is highly useful in tasks that involve searching for and fetching objects. Here, perception and action (as traditionally defined) are tightly coupled.
Understanding these boundaries is important, but so is being able to express them as behaviors and having a computational framework that processes information coherently over time. To that end, I am also interested in the principles that underlie robust information processing for perception-action streams. Currently, we analyze visual phenomena in humans while simultaneously building computational models, iterating quickly between analysis and synthesis to characterize what a robust information processing system for perception and action requires.
Oct 2019 - present
PhD Student, TU Berlin
Aug 2017 - May 2019
Master of Science, Georgia Institute of Technology
Sep 2009 - May 2013
Bachelor of Engineering, Siddaganga Institute of Technology
Personal website: https://oxidification.com/