My research focuses on understanding how to integrate computer interfaces with the human body, which I believe is the interface paradigm that will supersede wearable computing. My lab explores this by engineering interactive systems that intentionally borrow parts of the user’s body for input and output.
We have used our wearable muscle-stimulation devices, for example, to make a user’s muscles correctly manipulate a tool they have never used before; to computationally accelerate a user’s reaction time so they can photograph a fast-moving target; to read and write information without using a screen; and to transform someone’s arm into a plotter so they can solve computationally complex problems with just pen and paper.
We think these types of interactive devices are beneficial because they afford new emotional and physical modes of reasoning with computers, going beyond purely symbolic thinking (reasoning by typing and reading language on a screen). While this physical integration between human and computer is beneficial in many ways, it also requires tackling a series of new philosophical challenges, such as the question of agency: how do I feel when my body is integrated with an interface, and do I feel in control? We explore these questions, together with neuroscientists, by measuring and improving how the brain encodes the feeling of agency under this new kind of integrated interface.